hacker / leftist / shitposter

Mastodon: @drjenkem@mastodon.blugatch.tube

Matrix: @drjenkem:matrix.org

  • 0 Posts
  • 135 Comments
Joined 1 year ago
Cake day: June 14th, 2023

  • Depends on what you mean by general intelligence. I’ve seen a lot of people confuse Artificial General Intelligence with AI more broadly. Even something as simple as the K-nearest-neighbors algorithm is artificial intelligence; AI is a much broader topic than AGI.

    Well, I mean the ability to solve problems we don’t already have the solution to. Can it cure cancer? Can it solve the P vs. NP problem?

    And by the way, Wikipedia tags that second definition as dubious, as it is the definition put forth by OpenAI, who, again, has a financial incentive to make us believe LLMs will lead to AGI.

    Not only has it not been proven whether LLMs will lead to AGI, it hasn’t even been proven that AGIs are possible.

    If some task can be represented through text, an LLM can, in theory, be trained to perform it either through fine-tuning or few-shot learning.

    No it can’t. If the task requires the LLM to solve a problem that hasn’t been solved before, it will fail.

    I can’t pass the bar exam like GPT-4 did

    Exams are often poor measures of intelligence. They typically measure your ability to consume, retain, and recall facts, and LLMs are very good at that.

    Ask an LLM to solve a problem without a known solution and it will fail.

    We can interact with physical objects in ways that GPT-4 can’t, but it is catching up. Plus Stephen Hawking couldn’t move the same way that most people can either and we certainly wouldn’t say that he didn’t have general intelligence.

    The ability to interact with physical objects is very clearly not a good test for general intelligence and I never claimed otherwise.
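    The K-nearest-neighbors remark earlier in the thread can be illustrated with a minimal sketch: even a few lines of plain Python with no learned parameters qualifies as "artificial intelligence" in the broad sense. The training points and labels below are made-up toy data, not anything from the thread.

    ```python
    def knn_predict(train, labels, query, k=3):
        """Classify `query` by majority vote among its k nearest training points."""
        # Squared Euclidean distance in 2D; sort pairs of (distance, label).
        dists = sorted(
            ((x - query[0]) ** 2 + (y - query[1]) ** 2, lbl)
            for (x, y), lbl in zip(train, labels)
        )
        # Take the labels of the k closest points and return the most common one.
        nearest = [lbl for _, lbl in dists[:k]]
        return max(set(nearest), key=nearest.count)

    # Toy dataset: two well-separated clusters.
    train = [(0, 0), (1, 0), (0, 1), (5, 5), (6, 5), (5, 6)]
    labels = ["a", "a", "a", "b", "b", "b"]

    print(knn_predict(train, labels, (0.5, 0.5)))  # "a"
    print(knn_predict(train, labels, (5.5, 5.5)))  # "b"
    ```

    Note the contrast with the thread's point about LLMs: this classifier only ever reproduces patterns already present in its training data, which is exactly why "is it AI?" and "is it AGI?" are different questions.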