ARTIFICIAL INTERACTIONS AND COLLISIONS IN A VIRTUAL ENVIRONMENT
Bringing it, virtually.

OUTLINE

Introduction
  Purpose and Motivation
  Background
  Related work

Project Details
  Design Decisions
  Demonstration
  Implementation Details

Conclusion
  Limitations
  Future work
  Q + A

PURPOSE AND MOTIVATION

Provide a framework for further development of the Lumiere Ghosting project
  Easily modifiable
  Artistically adaptable
Allow for interactions between AI and human users as an interface in the CompuObscura

BACKGROUND ON THE LUMIERE GHOSTING PROJECT

Project started in 2002 by Prof. Gillette of the English department; currently maintained and led by Prof. Gillette and Prof. Fowler of the architecture department

Designed to be curriculum for interactive media and information design courses
  New Media I: Narratives & Semiotics
  New Media II: Technologies & Construction
  New Media Projects: Synthesis and Performance

Project encourages students to apply what they learn and engage in the development of the CompuObscura, “an interactive new media theatre”


RELATED WORK

Generic bots found in WoW, Counter-Strike, etc.
  AI exists to oppose the user
Second Life: human interaction in a virtual environment
  No motion tracking, i.e., the system reacts to user input rather than user movement
  No generic bot behavior system
Search assistants in text editors
  Rudimentary algorithms suggest words as they are typed; our project looks toward longer-term searching and goal achievement
Brainworks AI
  Complete rewrite of the Quake 3 AI

PROJECT REQUIREMENTS

1. The environment should support multiple avatars.
2. The user should be able to control the avatar movements.
4. The user should be able to see or detect other avatars in the vicinity of his avatar.
5. The system will implement a set of modularized behaviors and emotions that will be associated with the avatars.
6. The system will handle the collision-based transfer of avatar emotions and behaviors.
7. The artificial-intelligence behaviors of the avatar will be affected by the modularized behaviors and emotions being exchanged.
8. The user must be able to determine the changes that happened to the avatar after a collision.
9. After a collision the avatar should be able to continue to move freely in the environment.
10. If multiple avatars collide, all will be affected.
11. The avatars will be designed with the capability of performing simple missions autonomously.

BEHAVIORAL CHARACTERISTICS

DESIGN DECISIONS

Goal: a system capable of simulating first-person interactions
  Bots should have “personalities”
  Bots should be able to exchange and mix-and-match these “personalities”
  Previous implementations allowed for the addition of body parts
  Mixing through collision

DESIGN DECISIONS

System to base our work on
  Previous systems used proprietary software
  Open source?
A game engine was the obvious choice
  Pre-existing code for 'bots'
  Physics
  User-controlled characters
  Common

DESIGN DECISIONS

Game engines considered
  Source
  IOQuake3
We chose IOQuake3
  Free and open source
  Runs on almost any modern system
  Built-in inventory system
  Had all the features we needed, minus custom AI
    Which is what we wanted to do anyway :)

DEMONSTRATION

BACKGROUND ON IOQUAKE3

The server maintains the game state internally
  Clients send it updates on what the user does (picks up items, etc.)
  The server determines how clients interact
  The server controls the bots
The server can be modified for our own ends
  Clients can remain the same
  Easy upgrades to the system

IMPLEMENTATION DETAILS

Each bot has innate and acquired behavior
  IOQuake3 has bot characteristics: aggression, scaredness, ...
    We modify these to happiness, sadness, excitedness
  IOQuake3 has inventory control: rocket launcher, BFG10K
    We have shown it is possible to integrate the inventory with behaviors
  The combination gives us innate and acquired behavior
  When bots collide, they trade 'weapons'
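
To make the innate/acquired split concrete, here is a minimal C sketch of one way a collision exchange could be modeled. The names (bot_traits_t, Bot_Collide, MAX_ITEMS) and the averaging/union mixing rule are illustrative assumptions, not the project's actual IOQuake3 code.

/* Illustrative sketch only; names and mixing rule are hypothetical. */
#include <stddef.h>

#define MAX_ITEMS 16

typedef struct {
    /* Innate characteristics in [0.0, 1.0], repurposed from IOQuake3's
     * aggression, scaredness, ... to happiness, sadness, excitedness. */
    float happiness;
    float sadness;
    float excitedness;
    /* Acquired behavior: inventory slots used as behavior tokens. */
    int inventory[MAX_ITEMS];   /* nonzero = bot carries this 'weapon' */
} bot_traits_t;

/* On collision, blend the innate characteristics and trade inventory so
 * that each bot ends up with the union of the two inventories. */
void Bot_Collide(bot_traits_t *a, bot_traits_t *b) {
    float h = (a->happiness   + b->happiness)   * 0.5f;
    float s = (a->sadness     + b->sadness)     * 0.5f;
    float e = (a->excitedness + b->excitedness) * 0.5f;
    a->happiness = b->happiness = h;
    a->sadness = b->sadness = s;
    a->excitedness = b->excitedness = e;

    for (size_t i = 0; i < MAX_ITEMS; i++) {
        int shared = a->inventory[i] || b->inventory[i];
        a->inventory[i] = shared;
        b->inventory[i] = shared;
    }
}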

IMPLEMENTATION DETAILS

The server keeps a “bot state” for each bot
  The bot state contains an 'AINode'
    Points to the appropriate AI function for each state
    AINode_Afraid, AINode_Violent, ...
  Every 100 milliseconds, the AINode function is called
  Every 2 seconds, the AINode variable is updated
  The server starts each bot in a default state based on its characteristic values
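
The dispatch described above can be pictured roughly as below. This is a hedged sketch in C using the AINode names from the slides; it is not the real IOQuake3 bot_state_t or its update loop.

/* Hypothetical sketch of the AINode dispatch; names follow the slides,
 * not the actual IOQuake3 sources. */
typedef struct bot_state_s bot_state_t;
typedef int (*ainode_fn)(bot_state_t *bs);

struct bot_state_s {
    ainode_fn ainode;        /* current AI state function                     */
    float     violent;       /* characteristic values in [0, 1]; the project  */
    float     passive;       /* swaps in happiness, sadness, excitedness, ... */
    int       last_think_ms; /* time of last AINode call                      */
    int       last_state_ms; /* time of last state re-evaluation              */
};

/* One function per modularized behavior (stubbed here). */
static int AINode_Violent(bot_state_t *bs) { (void)bs; /* chase, attack */ return 0; }
static int AINode_Afraid (bot_state_t *bs) { (void)bs; /* run away, hide */ return 0; }

/* Placeholder next-state choice; a weighted version is sketched after the
 * collision slide below. */
static ainode_fn BotChooseNextNode(bot_state_t *bs) {
    return (bs->violent >= bs->passive) ? AINode_Violent : AINode_Afraid;
}

/* Called from the server's bot update loop with the current time. */
void BotThink(bot_state_t *bs, int now_ms) {
    if (now_ms - bs->last_state_ms >= 2000) {   /* every 2 seconds */
        bs->ainode = BotChooseNextNode(bs);
        bs->last_state_ms = now_ms;
    }
    if (now_ms - bs->last_think_ms >= 100) {    /* every 100 ms */
        bs->ainode(bs);
        bs->last_think_ms = now_ms;
    }
}

In this picture, the 100 ms call drives moment-to-moment actions, while the 2 second re-evaluation is where exchanged characteristics actually change behavior.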

IMPLEMENTATION DETAILS

Collisions transfer characteristics between the bots

The next state is then calculated by comparing the values of each characteristic

  Example: with Violent = 0.8 and Passive = 0.2, there would be about an 80% chance that the next AINode state would be set to AINode_Violent.
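
One simple way to realize that probability, assuming weights proportional to the characteristic values, is roulette-wheel selection. The helper below (PickWeighted) is a hypothetical illustration, not code from the project.

/* Illustrative helper: pick an index with probability proportional to its
 * weight. With weights {0.8, 0.2} for Violent and Passive, index 0
 * (AINode_Violent) comes up roughly 80% of the time. */
#include <stdlib.h>

int PickWeighted(const float *weights, int count) {
    float total = 0.0f;
    for (int i = 0; i < count; i++) total += weights[i];

    /* Roll a value in [0, total) and walk the cumulative weights. */
    float roll = ((float)rand() / ((float)RAND_MAX + 1.0f)) * total;
    for (int i = 0; i < count; i++) {
        if (roll < weights[i]) return i;
        roll -= weights[i];
    }
    return count - 1;   /* guard against floating-point round-off */
}

The returned index would then be mapped to the matching AINode_* function at the next 2-second state update.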

LIMITATIONS

Almost none
  The open source project allows for any modifications
  Some problems may be non-trivial
Lack of documentation
  We started from a blank slate, looking at the code
Lack of time
  Making a complete product in the time we had
  Two behaviours remain unimplemented

FUTURE WORK

We developed the framework
  Future teams can put the pieces together
  Need a graphics team to “renovate” the environment
  Give human interaction more meaning
Integration with the motion tracking component of the CompuObscura system
  Player control interaction
  Full character motion in-game
Integration with the inventory control system
  The functional prototype shows the viability of this system

CPE480 – Fall 2008
Computer Science Department
Cal Poly State University, San Luis Obispo

QUESTIONS AND ANSWERS

Authors
Daniel Nelson
Daniel Medina
Israel Urquiza
Alexander Sideropoulos

