Generative AI For The Rest Of US

Your Future, Decoded

From AI to Generative AI: Understanding the Magic Behind Our Machines
Sam Altman's Bet: Scaling Up

The first and second AI winters were periods of diminished funding and interest in artificial intelligence research. Although the field has existed for decades, it has only recently seen dramatic progress. What changed? One key factor is the availability of large-scale datasets and powerful computing resources: the advent of the internet and cloud computing made it possible to collect and process vast amounts of data.

In December 2015, a group of ambitious individuals, including Elon Musk, Sam Altman, and Greg Brockman, founded OpenAI, a research organization committed to developing artificial general intelligence (AGI) in a way that benefits humanity. Even though Elon Musk left the board in 2018, OpenAI has made significant strides. Four years after its inception, the organization transitioned from a non-profit to a "capped-profit" company, with investor returns capped at 100 times any investment.

Shortly after, OpenAI released GPT-3, which was a game changer. Generative AI existed long before GPT-3, but it was nowhere near as powerful. What made GPT-3 special? Its scale. Sam Altman, the CEO of OpenAI, wagered that if you scale up the model, you get better results. And he was right. GPT-3 has 175 billion parameters, which allows it to learn from vast amounts of data and generate high-quality outputs. Some estimates put the cost of training GPT-3 between $4 and $12 million. At this scale, training on a single powerful NVIDIA Tesla V100 GPU would take roughly 355 years.
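To see where a number like 355 years comes from, here is a back-of-envelope sketch. It uses the common rule of thumb that training a transformer takes about 6 × N × D floating-point operations (N parameters, D training tokens); the token count (~300 billion) and the sustained V100 throughput (~28 TFLOPS) are assumptions for illustration, not figures from this chapter:

```python
# Rough estimate: how long would GPT-3 training take on one V100?
N = 175e9            # GPT-3 parameters
D = 300e9            # assumed training tokens
total_flops = 6 * N * D          # ~6*N*D rule of thumb -> ~3.15e23 FLOPs

v100_flops_per_sec = 28e12       # assumed sustained mixed-precision throughput
seconds = total_flops / v100_flops_per_sec
years = seconds / (365.25 * 24 * 3600)

print(f"~{years:.0f} years on a single V100")
```

Under these assumptions the estimate lands in the mid-350s of years, consistent with the figure quoted above, which is why training at this scale requires thousands of GPUs running in parallel.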
