Google’s new AI can play 3D games like a human player.

Google DeepMind, the company’s AI research unit, has developed a new program that learns to play 3D games by following instructions given in natural language. The Scalable Instructable Multiworld Agent (SIMA) can recognize and understand a variety of game environments and act within them to achieve the goals it is given. It uses an advanced AI model to connect images and language, predict what will happen next on screen, and act accordingly.
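To make the idea concrete, here is a minimal, purely illustrative sketch of what an instruction-conditioned game agent loop looks like in principle: a screen frame and a natural-language command go in, a basic keyboard action comes out. Every name in it (ToyInstructableAgent, the SKILLS table, the Action type) is a hypothetical stand-in invented for this example; it is not DeepMind’s SIMA code, API, or model.

```python
# Illustrative only: a toy instruction-conditioned agent in the spirit of the
# behavior described above. A real agent would use a learned model over pixels
# and text; this sketch simply looks the instruction up in a small skill table.
from dataclasses import dataclass
import random


@dataclass
class Action:
    kind: str   # e.g. "key" for a keyboard press
    value: str  # e.g. "W", "ESC", "M"


class ToyInstructableAgent:
    """Maps (screen frame, instruction) pairs to basic game actions."""

    # A tiny stand-in for the roughly 600 basic skills mentioned in the article.
    SKILLS = {
        "turn left": Action("key", "A"),
        "climb the stairs": Action("key", "W"),
        "open the menu": Action("key", "ESC"),
        "open the map": Action("key", "M"),
    }

    def act(self, frame, instruction: str) -> Action:
        # The frame is ignored here; a real system would encode it jointly with
        # the instruction. Unknown commands fall back to a no-op action.
        return self.SKILLS.get(instruction.lower().strip(), Action("key", "NONE"))


if __name__ == "__main__":
    agent = ToyInstructableAgent()
    # A fake 4x4 grayscale "screen" standing in for the game image.
    fake_frame = [[random.randint(0, 255) for _ in range(4)] for _ in range(4)]
    for cmd in ["Open the map", "turn left", "build a camp"]:
        print(cmd, "->", agent.act(fake_frame, cmd))
```

The last command in the example ("build a camp") deliberately falls through to the no-op, mirroring the article’s point that the agent handles short basic skills but not longer, multi-step tasks.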

Google collaborated with eight game developers, including Hello Games and Coffee Stain, to train SIMA. The program was tested on popular titles such as No Man’s Sky, Teardown, Valheim, and Goat Simulator 3. SIMA has about 600 basic skills, including turning left, climbing stairs, opening menus, and using maps. However, it cannot yet carry out more complex tasks such as searching for resources or building a camp.

According to Google DeepMind, SIMA is still at an early research stage, and it is uncertain whether it will be widely available any time soon. Even so, SIMA marks an important milestone for AI, demonstrating its potential to learn and perform complex tasks in virtual environments and opening up many possibilities for future applications, particularly in the gaming industry. It also shows that AI is becoming increasingly capable of handling complex tasks, promising new advances across many fields in the near future.
