ECP NetHappenings Eric Schmidt “Future is SCARY”

©1998 *Educational CyberPlayGround®
https://edu-cyberpg.com

Note: Eric Schmidt no longer works for Google.


US vs. China for knowledge supremacy

Google CEO ERIC SCHMIDT BANNED Interview LEAKED: “Future is SCARY” (AI Pep Talk)

Make me a copy of TikTok. Steal all the users, steal all the music, put my preferences in it, produce this program in the next 30 seconds, release it, and in one hour, if it is not viral, do something different along the same lines. That’s the command.
If you can go from arbitrary language to arbitrary digital commands, which is essentially a Python command, imagine that each and every human on the planet has their own programmer that actually does what they want, as opposed to programmers that don’t.

Here are the key points from the interview:

Impact of LLMs: Schmidt believes that LLMs will have a significant impact on the world, even greater than social media. LLMs will be able to access and process information the way the human brain does and will be able to generate creative output, including different kinds of code.

Components of powerful LLMs: According to Schmidt, there are three key components that will make LLMs powerful. The first is context windows, which allow LLMs to access and remember short-term information. The second is text-to-action, which allows users to give instructions to LLMs in natural language and have them carry out those instructions. For example, a user could instruct an LLM to create a social media app similar to TikTok. The third component is the ability to learn and adapt, which allows LLMs to continuously improve their abilities.
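The "text-to-action" idea can be made concrete with a toy sketch. This is not Schmidt's system or any real product; it is a hypothetical, hand-written pattern table standing in for what an LLM would do generatively: translate a natural-language instruction into an executable command.

```python
import re

# Toy "text-to-action" translator. All names here are hypothetical;
# a real system would use an LLM to generate code from the instruction
# rather than match against a fixed pattern list.
def text_to_action(instruction: str) -> str:
    """Translate a tiny subset of English into a Python expression."""
    patterns = [
        (r"add (\d+) and (\d+)", lambda m: f"{m.group(1)} + {m.group(2)}"),
        (r"multiply (\d+) by (\d+)", lambda m: f"{m.group(1)} * {m.group(2)}"),
    ]
    for pattern, build in patterns:
        m = re.fullmatch(pattern, instruction.strip().lower())
        if m:
            return build(m)  # emit the generated "program"
    raise ValueError(f"instruction not understood: {instruction!r}")

command = text_to_action("Add 2 and 3")
print(command)        # the generated command: 2 + 3
print(eval(command))  # executing it: 5
```

The gap between this sketch and what Schmidt describes is exactly the point: an LLM replaces the fixed pattern table with open-ended code generation, so any instruction, not just the two hard-coded ones, can become an action.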

Challenges and risks of LLMs: There are several challenges and risks associated with LLMs. One challenge is the amount of data and computing power required to train these models. Another challenge is the difficulty in understanding how LLMs arrive at their outputs. Additionally, there is a risk of LLMs being used to create misinformation or to automate tasks that could put people out of work.

The future of AI: Schmidt believes that there is a large investment bubble in AI right now, and that there will likely be a shakeout in the coming years. However, he also believes that AI has the potential to bring about significant positive changes in the world.

Overall, the interview with Eric Schmidt provides a thought-provoking perspective on the potential of LLMs and AI. While there are challenges and risks associated with this technology, Schmidt believes that the potential benefits outweigh the risks.

Timecodes
0:00 Intro
0:10 Eric Schmidt at Stanford University
0:40 AI Impact on the World
1:15 How Human Brain Works
1:47 US Banning TikTok
2:05 How LLMs Are Dangerous
3:05 Options for Coding Assistance and AI Coder
4:00 Why Eric Invested in Small Companies
4:25 Eric on Sam Altman
4:40 Why Eric went to White House
4:55 Hydro Power – US, Canada, Arabs
6:30 Eric Schmidt’s Comment on Google That Got Him in Trouble
7:45 Elon Musk Behaviour
8:20 Physicists Work in Basement in Taiwan
9:10 Competition with China’s AI and AGI
10:20 US Banned Nvidia Chips
11:05 Ukraine War: $500 3D-Printed Drones vs. Russian Tanks
12:10 Robotic War
14:30 How to Break Existing AI Systems
17:50 My AI Investments
19:30 Mistral AI Investments
20:00 Lies of Eric Schmidt
22:15 Talk on Chemistry
23:00 Build Me a Google Competitor
23:10 Talk on Elections and Social Media
24:20 The Greatest Threat to Democracy
25:30 How LLMs will Replace Programmers/Researchers/Scientists
25:55 Top AI People Come from India
26:40 EU ACT
27:00 Most Important Parts of Interview

The Future Of AI, According To Former Google CEO Eric Schmidt
https://www.youtube.com/watch?v=DgpYiysQjeI

Eric Schmidt Full Controversial Interview on AI Revolution (Former Google CEO)
https://www.youtube.com/watch?v=mKVFNg3DEng

▓▓▓—▓▓▓—▓▓▓—▓▓▓—▓▓▓—▓▓▓