

Are you trying to anger the gods? Below the fixed stars, there are seven great gods and their lights in the Heavens. And each has its appropriate day. Which two gods shall you so tempt their wrath by denying their day?
I want a smartphone built into a shoe. Get Smart style.
LLMs as tools,
Yes, in the same way that buying a CD from the store, ripping to your hard drive, and returning the CD is a tool.
It’s not about hampering proliferation, it’s about breaking the hype bubble. Some of the western AI companies have been pitching to have hundreds of billions in federal dollars devoted to investing in new giant AI models and the gigawatts of power needed to run them. They’ve been pitching a Manhattan Project scale infrastructure build out to facilitate AI, all in the name of national security.
You can only justify that kind of federal intervention if it’s clear there’s no other way. And this story here shows that the existing AI models aren’t operating anywhere near where they could be in terms of efficiency. Before we pour hundreds of billions into giant data centers and energy generation, it would behoove us to first extract all the gains we can from increased model efficiency. The big players like OpenAI haven’t even been pushing efficiency hard. They’ve just been vacuuming up ever greater amounts of money to solve the problem the big and stupid way - just build really huge data centers running big inefficient models.
There’s nothing wrong with the fundamentals of the technology, just the applications that Westoids doggedly insist it be used for.
Westoids? Are you the type of guy I feel like I need to take a shower after talking to?
I love the fact that the same executives who obsess over return to office because WFH ruins their socialization and sexual harassment opportunities think they’re going to be able to replace all their employees with AI. My brother in Christ. You have already made it clear that you care more about work being your own social club than you do actual output or profitability. You are NOT going to embrace AI. You can’t force an AI to have sex with you in exchange for keeping its job, and that’s the only trick you know!
There are many clear use cases that are solid, so AI is here to stay, that’s for certain. But how far it can go, and what it will require, is what the market is gambling on.
I would disagree on that. There are a few niche uses, but OpenAI can’t even make a profit charging $200/month.
The uses seem pretty minimal as far as I’ve seen. Sure, AI has a lot of applications in terms of data processing, but the big generic LLMs propping up companies like OpenAI? Those seem to have no utility beyond slop generation.
Ultimately the market value of any work produced by a generic LLM is going to be zero.
How to address superintelligence, if that is actually something we realistically face:
Make creating an unlicensed AI above a certain capability threshold a capital offense.
Regulate the field of artificial intelligence as heavily as we do nuclear science and nuclear weapons development.
Have strict international treaties on model size and capability limitations.
Have inspection regimes in place to allow international monitoring of any electricity usage over a certain threshold.
Use satellites to track anomalous large power use across the globe (monitored via waste heat) and thoroughly investigate any large unexplained energy use.
Target the fabs. High powered chips should be licensed and tracked like nuclear materials.
Make clear that a nuclear first strike is a perfectly acceptable response to a nation state trying to create AGI.
Anyone who says this technology simply cannot be regulated is a fool. We’re talking models that require hundreds of megawatts or more to run and giant data centers full of millions of dollars worth of chips. There’s only a handful of companies on the planet producing the hardware for these systems. The idea that we can’t regulate such a thing is ridiculous.
I’m sorry, but I put the survival of the human race above your silly science project. If I have to put every person on this planet with a degree in computer science into a hole in the ground to save the human race, that is a sacrifice I am willing to make. Hell, I’ll go full Dune and outlaw computers altogether, go back to pen and paper for everything, before I condone AGI.
We can’t control this technology? Balderdash. It’s created by human beings. And human beings can be killed.
So, how do we deal with ASI? You put anyone trying to create it deep in the ground. This is self defense at a species level. Sacrificing a few thousand madmen who think they’re going to summon a benevolent god to serve them is simple self-defense. It’s OK to kill cultists who are trying to summon a demon.
You have to be willing to walk away from and ignore corporate media platforms, or else they’ll never be defeated. And content creators need to also learn to not post their stuff to these platforms.
Biden also announced he wasn’t enforcing the law. The TikTok operators saw the writing on the wall and realized they needed to bend the knee to Trump.
Don’t get too hung up on specific dates. Laws are not some physical law like gravity that are present and universal. They exist within a fuzzy context of enforcement and interpretation.
Biden made clear he wasn’t going to enforce the law. Trump made clear he was going to make a decision based on how well TikTok flattered and bribed him. So that’s exactly what they’ve done.
Due to two facts:
The samurai class in Japan officially lasted far later than you probably think (it wasn’t formally abolished until the Meiji reforms of the 1870s).
The earliest primitive fax machine existed much earlier than you probably think (Alexander Bain patented an electric printing telegraph in 1843).
It is technically possible for Abraham Lincoln to have received a fax from a samurai.
There’s no evidence it ever happened, but it technically could have happened.
Your conscious mind does not experience reality directly. There is no path going directly from your eyes to your conscious awareness. Rather, the subconscious collects sensory input. It uses that input to create a virtual simulacrum of the world, a big internal 3D model. That internal 3D representation is what you, the conscious part of your mind, actually interact with and experience.
You ever wonder how weird it is that people can have intense, debilitating hallucinations? Like schizophrenics seeing and hearing entirely fictional things. Have you ever seen a camera produce anything like that? A flash of light, a distorted image, dead pixels? Sure, cameras can produce those kinds of errors. But a camera will never display a vivid, realistic image of a person who was never actually in its field of view.
Yet the human mind is capable of this. In the right circumstances, the human brain is capable of spawning entire fictional people into your conscious awareness. This shows that there is an elaborate subconscious processing layer between direct sensory input and what our conscious mind observes. Your conscious mind is basically experiencing a tiny little internal version of The Matrix, entirely generated on its own wetware. And this subconscious processing layer is what makes hallucinations possible. The processes that produce this internal simulation can become corrupted, and when they do, hallucinations result.
This architecture is also what makes dreaming possible. If your conscious mind only perceived things upon direct sensory feedback from the eyes, ears, etc., how would dreaming be possible?
You are essentially experiencing reality through an elaborate 3D-modeling version of an AI video generator.
Many will say that World War Three cannot happen, that nuclear weapons will prevent it. However, this assumes that World War Three has to be global thermonuclear war, rather than some repeat of the previous world wars.
Cities don’t have to be leveled for nations to fight a world war. The US fought two world wars, and we never had our cities and infrastructure decimated. What I can imagine is a future world war where all the major players fight the war in the same way the US fought the two previous wars. Both sides contribute massive resources, adopt wartime economies, throw their whole populations behind the effort, etc., but at no point do the various combatants directly attack the main territory and population centers of the other side. You could have a conflict where both sides lose millions of troops fighting it out in some third-party territory, but the nukes never fly, as all sides realize that invading the home territory of the others is suicide.
Happy Day of Thor to you this fine afternoon!