Augmented Reality Training Tools and Simulators
Airway Intubation Project [PDF]
Augmented Reality (AR) has been proposed for medical training simulation systems. The user visualizes objects in 3D and interacts with the real environment; the virtual 3D objects enhance (visually augment) the real environment. Methods associated with AR include camera calibration procedures and dynamic superimposition procedures (to bring virtual objects into register with real objects). The methods we employ assume that the real object is defined by a cluster of markers dispersed on the object's surface, and that a position/orientation tracking system provides the individual 3D location of each marker in the tracker frame of reference. Given the geometry of the virtual objects, the transformation required to register real with virtual objects is determined. We have also investigated new algorithms for placing InfraRed Emitting Diodes (IREDs) on irregular objects to improve the tracking capability of the system.
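As a simplified illustration of the superimposition step, the following sketch (not the project's actual code; the pose matrix and vertex values are invented for the example) applies a tracker-reported rigid pose, expressed as a 4×4 homogeneous transform, to a virtual model vertex so that the virtual object is brought into register with the tracked real object:

    // Illustrative sketch: registering a virtual object with a tracked real
    // object, assuming the tracking system reports the real object's pose as a
    // 4x4 rigid transform (rotation + translation) in the tracker frame.
    public class PoseRegistration {

        /** Applies a row-major 4x4 rigid transform to a 3D vertex. */
        static double[] applyPose(double[][] pose, double[] vertex) {
            double[] p = {vertex[0], vertex[1], vertex[2], 1.0}; // homogeneous coordinates
            double[] out = new double[3];
            for (int r = 0; r < 3; r++) {
                for (int c = 0; c < 4; c++) {
                    out[r] += pose[r][c] * p[c];
                }
            }
            return out; // vertex expressed in the tracker frame
        }

        public static void main(String[] args) {
            // Hypothetical pose: 90-degree rotation about Z plus a translation (meters).
            double[][] pose = {
                {0, -1, 0, 0.10},
                {1,  0, 0, 0.25},
                {0,  0, 1, 0.00},
                {0,  0, 0, 1.00}
            };
            double[] modelVertex = {0.05, 0.0, 0.0}; // vertex in the virtual model frame
            double[] world = applyPose(pose, modelVertex);
            System.out.printf("Registered vertex: (%.3f, %.3f, %.3f)%n",
                    world[0], world[1], world[2]);
        }
    }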

Augmented Reality Medical Training Tools [PDF]
Presented at MMVR 2003:
Airway Lungs
Optical superimposition of the 3D lungs model over a human patient simulator (HPS).

Marking Irregular Objects for Position and Orientation Tracking
Marker Mapping Algorithms
The requirements for tracking in AR environments are stringent, driven by the need to register real and virtual objects. To meet the need to track real objects within these environments, I proposed two algorithms for distributing markers (IREDs) on complex rigid objects. The proposed algorithms employ an optimization technique with a spherical or cylindrical intermediary surface. The validity and effectiveness of the algorithms were tested heuristically by simulation. Important issues that surfaced in the experiments are the type of markers (i.e., active vs. passive) and the cones of emission of different types of active markers, as well as their impact on marker distribution and orientation.
  • The Quiescent Algorithm generates a uniform distribution for a specified number of markers on the object's surface; the number of markers is determined through an iterative process (see the sketch after this list).
  • The ViewPoint Algorithm minimizes the number of markers while ensuring that at least k markers are seen (detected) from different viewpoints. The number of markers required to determine an object's position and orientation from each viewpoint depends on the tracking system used (a frequent number is three).
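As a toy illustration of the optimization behind the Quiescent Algorithm, the sketch below distributes a fixed number of markers on a unit sphere (the intermediary surface) with simulated annealing, maximizing the minimum pairwise distance. The cost function, cooling schedule, and perturbation size are assumptions made for the example, not the published algorithm's parameters:

    import java.util.Random;

    // Toy sketch: uniform marker distribution on a unit sphere via simulated
    // annealing. Cost = negative minimum pairwise distance (lower is better).
    public class SphereMarkers {
        static final Random RNG = new Random(42);

        static double[] randomUnitVector() {
            double[] v = new double[3];
            double norm;
            do {
                for (int i = 0; i < 3; i++) v[i] = RNG.nextGaussian();
                norm = Math.sqrt(v[0]*v[0] + v[1]*v[1] + v[2]*v[2]);
            } while (norm < 1e-9);
            for (int i = 0; i < 3; i++) v[i] /= norm;
            return v;
        }

        static double cost(double[][] m) {
            double min = Double.MAX_VALUE;
            for (int i = 0; i < m.length; i++)
                for (int j = i + 1; j < m.length; j++) {
                    double dx = m[i][0]-m[j][0], dy = m[i][1]-m[j][1], dz = m[i][2]-m[j][2];
                    min = Math.min(min, Math.sqrt(dx*dx + dy*dy + dz*dz));
                }
            return -min;
        }

        public static void main(String[] args) {
            int n = 12; // assumed marker count; the real algorithm iterates on this
            double[][] markers = new double[n][];
            for (int i = 0; i < n; i++) markers[i] = randomUnitVector();

            double temp = 1.0, current = cost(markers);
            for (int step = 0; step < 20000; step++, temp *= 0.9995) {
                int k = RNG.nextInt(n);
                double[] old = markers[k];
                // Perturb one marker and renormalize it back onto the sphere.
                double[] cand = new double[3];
                double norm = 0;
                for (int i = 0; i < 3; i++) {
                    cand[i] = old[i] + temp * 0.2 * RNG.nextGaussian();
                    norm += cand[i]*cand[i];
                }
                norm = Math.sqrt(norm);
                for (int i = 0; i < 3; i++) cand[i] /= norm;
                markers[k] = cand;
                double next = cost(markers);
                // Metropolis acceptance: keep worse states with decreasing probability.
                if (next < current || RNG.nextDouble() < Math.exp((current - next) / temp)) {
                    current = next;
                } else {
                    markers[k] = old; // reject the move
                }
            }
            System.out.printf("Minimum pairwise distance: %.4f%n", -current);
        }
    }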

Markers are distributed (Simulated Annealing) on a sphere.

Markers are distributed on the object's surface.
 

Markers and their cone of emission.

3D object after marker mapping using the ViewPoint Algorithm; white cubes represent the camera positions.
 
The ARC Display
An Augmented Reality Visualization Center [MPG]
Presented at ISMAR 2002, Darmstadt, Germany. The ARC display represents a significant advancement toward our vision of a Multimodal Augmented Reality system with 3D visual, 3D audio, and haptic capabilities. The display consists of a curved, retroreflective wall; a head-mounted projective display (HMPD); a commercially available optical tracking system; and a Linux-based PC. The HMPD takes advantage of a revolutionary set of lightweight optics, which allow for a 52° field of view and weigh just 8 g per eye. Images are currently rendered on two 640×480-pixel LCDs encased in the HMPD, yielding a visual acuity of about 3.5 arc minutes.

ARC room interior (retroreflective walls).

ARC room exterior (cylindrical shape).
 
Distributed Virtual Environments for Medical Simulation and Training
Scene Synchronization for Distributed Mixed and Virtual Reality Systems [MPG]
An effective, distributed MR/VR collaborative environment that supports remote, real-time interactions would allow users to approach their medical, engineering, or scientific data as a team, with each participant holding a unique perspective. This may lead to startling observations and enhanced creativity, as a result of the synergy that develops among a group of people working together as if they were physically present in a common place.
I have proposed a novel criterion for categorizing distributed MR/VR applications based on their action-frequency patterns. Such a criterion helps designers of distributed collaborative environments better match application attributes with network (infrastructure) parameters.
In support of the proposed criterion, I have implemented and analyzed an adaptive synchronization algorithm that addresses network latency problems in distributed MR/VR applications. The decentralized computation of the drift values improves the system's scalability and its interactive behavior. The algorithm is highly efficient when the network "upshot frequency" is higher than the user's "action frequency." I believe such distributed applications will become widespread as low-latency optical networks and optical routing become increasingly available. (For more info please contact me or see the associated publications.)
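The sketch below captures only the core idea under stated assumptions: each node independently estimates its drift from the timestamps carried by incoming state updates and skews its local scene clock toward the shared timeline. The class name, field names, and gain factor are invented for the illustration, and network transit time is ignored; the published algorithm is described in the associated papers:

    // Hedged sketch of decentralized drift correction: every node adjusts only
    // its own clock, with no master server involved.
    public class SceneClock {
        private double offsetMs = 0.0;          // estimated correction to the local clock
        private static final double GAIN = 0.1; // fraction of the drift corrected per update

        /** Called whenever a timestamped state update arrives from a remote node. */
        public synchronized void onRemoteUpdate(long remoteTimestampMs) {
            long localNow = System.currentTimeMillis();
            // Positive drift: our scene clock runs ahead of the sender's timeline.
            double drift = (localNow + offsetMs) - remoteTimestampMs;
            offsetMs -= GAIN * drift; // nudge the clock toward the shared timeline
        }

        /** Scene time used to evaluate shared animations (e.g., object orientations). */
        public synchronized double sceneTimeMs() {
            return System.currentTimeMillis() + offsetMs;
        }
    }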

Synchronization algorithm disabled: The 3D crosses have different orientations.

Synchronization algorithm enabled: The 3D crosses have the same orientation.
 
Distributed Augmented Reality Training Tool
(AR Component [MPG] and Distributed Component [MPG])
A distributed training prototype that allows visualization of a 3D deformable lung model superimposed on a human patient simulator (HPS) at several remote trainee locations. At the same time, the 3D models are shared among a trainer and a few remote trainees. The prototype integrates:
  • deformable 3D anatomical models (lung);
  • a distributed, interactive virtual environment (VE) targeted at medical prognostics and training;
  • a synchronization module for maintaining state consistency across two and three nodes; and
  • novel optical see-through Head Mounted Displays.
(For more info please contact me or see associated publications.)
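As an illustration only (the prototype's actual protocol is described in the associated publications; the multicast group, port, payload layout, and the 4-second respiratory cycle below are assumptions), a node could share the lung model's breathing phase with the remote trainees like this:

    import java.net.DatagramPacket;
    import java.net.InetAddress;
    import java.net.MulticastSocket;
    import java.nio.ByteBuffer;

    // Illustrative sketch: broadcasting a breathing phase so remote nodes can
    // deform their local copies of the lung model in synch.
    public class BreathBroadcaster {
        public static void main(String[] args) throws Exception {
            InetAddress group = InetAddress.getByName("230.0.0.1"); // hypothetical group
            try (MulticastSocket socket = new MulticastSocket()) {
                long start = System.currentTimeMillis();
                while (true) {
                    // Breathing phase in [0,1), assuming a 4-second respiratory cycle.
                    double phase = ((System.currentTimeMillis() - start) % 4000) / 4000.0;
                    byte[] payload = ByteBuffer.allocate(8).putDouble(phase).array();
                    socket.send(new DatagramPacket(payload, payload.length, group, 4446));
                    Thread.sleep(50); // ~20 updates per second
                }
            }
        }
    }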

Distributed component: 3D lungs breathing in synch on remote nodes.

AR component: 3D lungs superimposed on the HPS.
 
Hybrid Nodes for Distributed Mixed Reality (MR) Applications
A distributed mixed reality (MR) or virtual reality (VR) environment implies the cooperative engagement of a set of software and hardware resources. With advances in sensors and computer networks, we have seen an increase in the number of potential MR/VR applications that require large amounts of information collected from the real world through sensors (e.g., position and orientation tracking sensors). These sensors collect data from the real environment in real time at different locations, and a distributed environment connecting them must ensure data distribution among collaborating sites at interactive speeds. With further advances in sensor technology, we envision that in future systems a significant amount of data will be collected from sensors and devices attached to the participating nodes.
We propose a new architecture for sensor-based interactive distributed MR/VR environments that falls between the atomistic peer-to-peer model and the traditional client-server model. Each node is autonomous and fully manages its resources and connectivity. The dynamic behavior of the nodes is dictated by the human participants who manipulate the sensors attached to these nodes.
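A minimal sketch of such a hybrid node, with placeholder names (HybridNode, push, and the peer-address strings are invented for the illustration, and the transport is left abstract):

    import java.util.Map;
    import java.util.Set;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.concurrent.CopyOnWriteArraySet;

    // Sketch of the hybrid-node idea: each node is the authority for its own
    // sensor data (like a server) yet connects directly to peers (like a peer
    // in a P2P overlay) and manages its own connectivity.
    public class HybridNode {
        /** Latest reading per locally attached sensor; this node owns these. */
        private final Map<String, double[]> localSensors = new ConcurrentHashMap<>();
        /** Directly connected peers; changes as participants come and go. */
        private final Set<String> peers = new CopyOnWriteArraySet<>();

        /** Called by the local tracking-hardware driver (assumed interface). */
        public void onSensorSample(String sensorId, double[] poseSixDof) {
            localSensors.put(sensorId, poseSixDof);
            for (String peer : peers) {
                push(peer, sensorId, poseSixDof); // node distributes its own data
            }
        }

        /** The node autonomously accepts or drops peer connections. */
        public void connect(String peerAddress) { peers.add(peerAddress); }
        public void disconnect(String peerAddress) { peers.remove(peerAddress); }

        private void push(String peer, String sensorId, double[] pose) {
            // Transport is an assumption; e.g., one UDP datagram per sample.
            System.out.printf("-> %s : %s%n", peer, sensorId);
        }
    }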
 
Distributed-Systems-Behavior Simulators
Interactive PetriNet Simulator
The Interactive PetriNet Simulator (IPS) is a graphical tool for interactively constructing Petri nets and simulating their behavior step by step.

Candy machine simulation.

Opening an XML file containing the PetriNet definition.
 
The application was developed using the Java Software Development Kit from Sun Microsystems and can run either as a Java applet or as a standalone Java graphical application. When running in applet mode, the application does not have access to the host file system, for security reasons.
The IPS has a GUI that allows the user to interactively build a Petri net and simulate its behavior. The GUI also allows the user to read a PetriNet definition from an XML file; the file is parsed and the network is drawn on the application's frame. The user can run the simulation step by step while observing the network marking.
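The core of such a step-by-step simulation is the standard Petri net firing rule: a transition is enabled when every input place holds at least one token, and firing it consumes one token per input place and produces one per output place. The minimal sketch below shows that rule in isolation (weighted arcs, the GUI, and the XML loader are omitted, and the class name is invented for the example):

    import java.util.List;

    // Minimal sketch of one Petri net simulation step over a marking.
    public class PetriNetStep {
        static boolean isEnabled(int[] marking, List<Integer> inputs) {
            for (int place : inputs) {
                if (marking[place] < 1) return false;
            }
            return true;
        }

        static void fire(int[] marking, List<Integer> inputs, List<Integer> outputs) {
            for (int place : inputs) marking[place]--;   // consume tokens
            for (int place : outputs) marking[place]++;  // produce tokens
        }

        public static void main(String[] args) {
            // Toy two-place net: a single transition moves a token from p0 to p1.
            int[] marking = {1, 0};
            List<Integer> inputs = List.of(0), outputs = List.of(1);
            if (isEnabled(marking, inputs)) {
                fire(marking, inputs, outputs);
            }
            System.out.println("Marking: p0=" + marking[0] + ", p1=" + marking[1]);
        }
    }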

Middleware Research
Middleware—TSpaces
TSpaces is network middleware for the new age of ubiquitous computing. A succinct description of TSpaces would be: a network communication buffer with database capabilities. However, it is often easier to describe it in terms of what it does: it enables communication between applications and devices in a network of heterogeneous computers and operating systems. TSpaces provides group communication services, database services, URL-based file-transfer services, and event-notification services. It is implemented in the Java programming language and thus automatically possesses network ubiquity through platform independence, as well as a standard type representation for all data types.
(Related links: T-Spaces, JXTA)
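To convey the tuple-space style of coordination that TSpaces offers (write a tuple; read or take a matching one), here is a toy in-memory analogue. This is not the TSpaces API; the class and method names are invented for the illustration:

    import java.util.Arrays;
    import java.util.Iterator;
    import java.util.LinkedList;

    // Toy tuple-space-like buffer: producers write tuples, consumers block
    // until a matching tuple appears, mimicking event notification.
    public class ToyTupleSpace {
        private final LinkedList<Object[]> tuples = new LinkedList<>();

        /** Producer side: publish a tuple for any consumer to take. */
        public synchronized void write(Object... tuple) {
            tuples.add(tuple);
            notifyAll(); // wake up blocked consumers
        }

        /** Consumer side: block until a tuple whose first field matches, then remove it. */
        public synchronized Object[] take(Object tag) throws InterruptedException {
            while (true) {
                for (Iterator<Object[]> it = tuples.iterator(); it.hasNext(); ) {
                    Object[] t = it.next();
                    if (t.length > 0 && t[0].equals(tag)) {
                        it.remove();
                        return t;
                    }
                }
                wait(); // nothing matches yet
            }
        }

        public static void main(String[] args) throws InterruptedException {
            ToyTupleSpace space = new ToyTupleSpace();
            space.write("sensor", "tracker-1", 0.25);
            System.out.println(Arrays.toString(space.take("sensor")));
        }
    }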
©2008–2018, Felix G. Hamza-Lup.
All Rights Reserved.