
Cerebras and Opentensor have trained a powerful 3 billion parameter language model with an 8K context window, called BTLM-3B-8k-base, on the Condor Galaxy 1 (CG-1) supercomputer. The new model outperforms similar 3B models, achieves performance comparable to open 7B parameter models, can be quantized to fit on devices with as little as 3 GB of memory, and is licensed for commercial use. It requires 71% fewer training FLOPs and has a 58% smaller memory footprint for inference than comparable 7B models.


Warren Buffett expressed cautious fascination with artificial intelligence and ChatGPT in a CNBC interview. While admitting that he doesn't understand AI well enough to invest in it, the legendary investor still sees its potential for everyday use. He said he wants to understand how the technology evolves into a tangible, valuable business. Buffett also acknowledged the risks associated with AI, comparing it to the invention of the atomic bomb and expressing uncertainty about its long-term benefits to humanity.
