23: Testing a Proprietary Virtual AI inside of a Simulated Space and Comparing Different AI Systems

Vance Boatwright, Co-Author
Simology.com
 
Robert Norton, First Author and Presenting Author
GASPgroup.Org
 
Tuesday, Aug 5: 10:30 AM - 12:20 PM
1359 
Contributed Posters 
Music City Center 
Using proprietary haptic motion sensing techniques, we at Simology.com deliver AI learning technology that allows an AI to learn motions and recreate all types of motion in a simulated environment. This presentation demonstrates the technology itself, its uses, and a comparison to other AI interface models. To create a sentient AI capable of human-like actions, the AI must learn not only pixel data but a range of sensations: hearing, taste, smell, and touch. At Simology we are currently teaching our virtual AI to process sounds with text-to-speech (TTS). We intend to improve these speech capabilities using motion analysis of sound transitions. Similarly, we are preparing a touch learning capability based on a pressure switch. Taste and smell functions for an AI will be learned with microfluidics by detecting the motion of liquids. Eventually these techniques will be applied in robotics to design artificial animal behavior. We compare the advances of the Simology virtual AI against competitors such as WebSim and Claude. Our results indicate that Simology can deliver haptic motion sensing over standard HTML-based web interfaces and produces superior models of movement.
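As a point of reference, the pressure-switch touch capability mentioned above reduces, in its simplest form, to debouncing a binary sensor stream into discrete touch events. The sketch below is purely illustrative and assumes nothing about Simology's actual implementation; the function name, sample format, and threshold are our own hypothetical choices.

```python
# Hypothetical sketch: converting raw pressure-switch samples (0 = open,
# 1 = pressed) into debounced "press"/"release" touch events.
# Not Simology's API; all names and thresholds are illustrative.

def touch_events(samples, hold=3):
    """Yield ("press", i) / ("release", i) events from a binary sample
    stream, requiring `hold` consecutive identical readings before a
    state change is accepted (a basic software debounce)."""
    state = 0   # current debounced state
    run = 0     # consecutive readings that disagree with `state`
    for i, s in enumerate(samples):
        if s != state:
            run += 1
            if run >= hold:          # stable long enough: commit change
                state = s
                run = 0
                yield ("press" if state else "release", i)
        else:
            run = 0                  # glitch rejected; reset the counter

# Example: a noisy spike at index 1 is ignored; only the sustained
# press and release are reported.
events = list(touch_events([0, 1, 0, 1, 1, 1, 1, 0, 0, 0, 0]))
```

The `hold` parameter trades responsiveness for noise immunity, the same trade-off any physical touch sensor interface must make.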

Keywords

artificial intelligence

simulation

API support

advanced physics engine

robotics

interface 

Main Sponsor

Section on Physical and Engineering Sciences