However, Wikipedia editors consider Media Bias/Fact Check as “generally unreliable”, recommending against its use for what some see as breaking Wikipedia’s neutral point of view.
Though errors are somewhat monitored by Retraction Watch.
Or as Dijkstra puts it: “asking whether a machine can think is as dumb as asking if a submarine can swim”.
Alan Turing put it similarly: the question, as posed, is nonsense. However, if you define “machine” and “thinking”, and redefine the question to mean “is machine thinking differentiable from human thinking?”, you can theoretically answer affirmatively (rough paraphrasing). Though the current evidence suggests otherwise (e.g. AI learning from other AI drifts toward nonsense).
For more, see Turing’s original paper, Computing Machinery and Intelligence, which introduces the Imitation Game.
Yet they (possibly) use AI to determine whether users’ answers are AI.
Let’s extend this thought experiment a little. Consider just forum posts; the numbers will be somewhat similar for articles and other writings, as well as photos and videos.
How many more posts does a bot create than a human? Being (ridiculously) conservative, we’ll say 10x more.
On day one: 10 humans are each posting (for simplicity’s sake) 10 times a day, totaling 100 posts. The one bot is posting 100 a day. That’s 200 posts from humans and bot combined, 50% of which are the bot’s.
In your (extended) example, at the end of a year: the 10 humans are still posting 100 times a day in total. The 10 bots are posting a total of 1,000 times a day. Bots are at roughly 90%, humans 10%.
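The arithmetic above can be sketched in a few lines (the numbers are taken straight from the example; nothing else is assumed):

```python
def bot_share(human_posts: int, bot_posts: int) -> float:
    """Fraction of all posts that come from bots."""
    return bot_posts / (human_posts + bot_posts)

# Day one: 10 humans x 10 posts each, 1 bot x 100 posts.
print(f"{bot_share(100, 100):.0%}")   # 50%

# Year's end: same 100 human posts, 10 bots x 100 posts each.
print(f"{bot_share(100, 1000):.0%}")  # 91%
```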
This statistic can lead you to think human participation on the Internet is difficult to find.
Returning to reality, consider how inhuman AI bots are: each could probably out-post humans by millions or billions of times, under millions of aliases. If you find search engines, articles, forums, reviews, and such bonkers now, just wait a few years. Predicting general chaotic nonsense for the Internet, with very few islands of humanity, is a rational conclusion. Unless bots are stopped.
Right now though, bots are increasing.
Exactly. A more accurate headline would be “Americans are Falling Behind on their Income.”
Yes, though in some locales there are “work crews” (slave labor) that clear brush, road litter, and such for businesses, organizations, the state, and individuals.
Some other folks just took the bus.
Here’s a non-recommended, non-standard, bad practice work-around:
This looks somewhat like a blank line in a browser, but who knows what’ll happen in other apps. (Click the “view source” icon for this example.)
After a bit of research, I’m forced by facts (NS records can be cached for an undetermined time) to see what you’re saying. Thank you for teaching me.
The workings are, of course, a bit more complicated than what either of us has said (here’s a taste), but there is a situation as you describe, where separating the registrar from the name servers, and the name servers from the domain, could save the domain from going down.
If a registrar goes out of business, ICANN transfers the domain(s) to another registrar.
If a name server business fails, you change name servers through your registrar.
You can’t really fix registrar services in your name server, nor name server problems through your registrar. (Unless, of course, your registrar is also your name server.)
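A toy sketch of the NS-record caching behavior mentioned above (every domain and name-server name here is made up for illustration, and real resolvers are far more involved; this only shows why a cached delegation keeps working for a while after its source has problems):

```python
import time

class NSCache:
    """Toy resolver cache: NS records stay usable until their TTL expires,
    which is why a domain can keep resolving for a while even after its
    registrar or name-server provider has trouble."""

    def __init__(self):
        self._store = {}  # domain -> (nameservers, expiry timestamp)

    def put(self, domain, nameservers, ttl_seconds):
        self._store[domain] = (nameservers, time.time() + ttl_seconds)

    def get(self, domain):
        entry = self._store.get(domain)
        if entry and time.time() < entry[1]:
            return entry[0]  # still cached: resolution proceeds
        return None          # expired: must re-query the (possibly dead) parent

cache = NSCache()
# A common NS TTL is on the order of days (172800 s = 2 days).
cache.put("example.com", ["ns1.example-dns.net", "ns2.example-dns.net"],
          ttl_seconds=172800)
print(cache.get("example.com"))  # ['ns1.example-dns.net', 'ns2.example-dns.net']
```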
It’s a decent, testable hypothesis, if there were a center. A center seems obvious in the familiar mechanical way of, say, a firecracker: it certainly has a center, with debris flying in every direction from that point.
However (to use a problematic oversimplification): what if the universe is something like the surface of a balloon being blown up? Where is the center?
Wherever you put your finger, the rest of the balloon’s surface is expanding away from that point. One such center point is Earth. Every other place in the universe also appears to be a center.
When you look at the evidence (data from telescopes and such), the expansion of the universe is closer to the balloon-surface picture than the firecracker picture, even though the firecracker picture is easier to comprehend.
Like, say, slow down an older phone so one has to buy a new faster phone? Source
Sounds like a job for Lenny bot. (There are samples on video sites).
This would be seriously useful, what are the impeccable primary sources?
The way the market works: You charge a competitive price that allows you to cover your costs and make a profit. If your product provides enough value to the buyer, they’ll pay for it.
That’s what’s taught. There’s quite a bit more in practice, including: what insurance companies learned from management consultants.
But they aren’t colluding to eke every ounce of money from people.
Maybe so, though there appears to be a common interest.
In 2016, HDDs were more reliable (MTBF).
In 2022, over the first 5 years of service, SSDs are looking more reliable, with a roughly constant failure rate (about 1%/yr), versus HDDs’ failure rate, which climbs after year 5.
(Caveat: not just bit rot, but general failure data.)
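As a quick sanity check on what a constant rate implies (the 1%/yr figure comes from the comment above; the assumption that each year’s failure chance is independent is mine):

```python
def survival(annual_failure_rate: float, years: int) -> float:
    """Chance a drive is still working after `years`, assuming a constant
    annualized failure rate with independent years."""
    return (1 - annual_failure_rate) ** years

# SSD-style constant ~1%/yr rate over the first 5 years:
print(f"{survival(0.01, 5):.1%}")  # 95.1%
```

So a constant 1%/yr rate leaves about 95% of drives alive at year 5; it’s the HDD curve bending upward after that point that tips the comparison.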
Small enough to fit on a CD, which isn’t everyone’s definition of “small.” There are, of course, much smaller Linux distros, less than a tenth the size; particularly if CLI is adequate.
In addition to the Texas stand on Medicaid expansion, from the article:
Texas is “ground zero” for the Medicaid unwinding, Alker said. The state leads the U.S. in disenrollments, with around 1.7 million this year, according to KFF.
Huh, so it is; it was there last January. It used to follow this paragraph (still there today, anyway), which contains a similar criticism, with citation:
So if those are considered fact-based, there’s no need to delve further.