2 min read

Aidan Gomez

QuHarrison Terry presents Aidan Gomez, CEO of Cohere, with the WTF Innovators Award for his foundational contributions to LLM architectures and for enabling enterprises worldwide to easily develop and deploy LLMs within their existing products.

The WTF Innovators Award recognizes excellence at the precipice of societal change, with the inaugural class focusing on AI innovators. As a memento, each of the 34 awardees is gifted a featured song by QuHarrison Terry and Genesis Renji. We present “Coherent”, produced by Nimso, to Aidan Gomez.

Cohere enables businesses and developers of all sizes to integrate generative AI features into their apps, platforms, and products. The company focuses on enterprise use cases, working with customers to train custom LLMs on their data and allowing them to deploy on whichever cloud provider they already use.

Cohere’s models power interactive chat features, generate text for product descriptions and articles, and capture the meaning of text for search, content moderation, and intent recognition. Their tech enables businesses to explore, generate, and search for information in a new way that’s intuitive and natural.

In June 2023, Cohere raised a $270M Series C funding round at a $2.2B valuation. Their customers include Oracle, Salesforce, Spotify, Jasper, and BambooHR, among others.

They’ve partnered with Oracle to embed generative AI into a number of its products, while also serving as the go-to AI provider for Oracle’s customers. In return, Cohere uses Oracle’s cloud to train, build, and deploy its own AI models.

Not only did Aidan co-author the seminal paper on Transformers which ignited the revolution in generative AI, but he continues to contribute to AI research while running a billion-dollar AI company. You can sense his genuine enthusiasm for making massive neural networks more efficient, and getting them deployed at scale in the real world. – QuHarrison Terry.

Prior to founding Cohere, Aidan Gomez led a team of researchers on machine learning projects at FOR.ai. During his time as a Student Researcher at Google Brain in 2017, he co-authored the paper “Attention Is All You Need”, which introduced the Transformer network and spurred much of the modern innovation in generative AI. He has co-authored a number of other research papers on GANs, Transformers, neural networks, and self-attention for language models.