While Google did not disclose how many parameters PaLM 2 was trained on - its predecessor, PaLM, had 540 billion parameters - Google promises that PaLM 2 delivers improved capabilities alongside faster, more efficient performance. According to Google, PaLM 2 has already been quietly powering Google Bard, and many future integrations are planned. Google is positioning PaLM 2 as the foundational AI model behind its future AI products.