Extended Abstract (not final paper)
Design review and visualization steering using the INQUISITIVE interaction toolkit
Lakshmi Sastry, David Boyd and Michael Wilson
Information Technology Department
CLRC Rutherford Appleton Laboratory,
Chilton, Didcot, OX11 0QX
As novel technologies are absorbed into the conventional systems development methodology
they usually pass through three phases: firstly, they are demonstrated in specialised stand-alone systems; secondly, they are demonstrated in rapid prototype environments used to elicit requirements from users unfamiliar with the potential of the technology in their domain; and thirdly, system representations and documentation for each contractually important development stage, and methods for moving between them, are defined incorporating best practice. As knowledge based technologies did in the late 1980s, so Virtual Reality (VR) technologies appear now to be moving from the second to the third of these phases, and establishing manageable system development practices that can be subject to general contractual obligations (Wilson et al, 1988).
The current market for VR development tools ranges from public domain toolkits to high cost prototyping and development environments. Equally, some tools continue to address a generic range of VR applications, while others are becoming more focused on one application area, for example, being linked as a real-time interaction environment to a 3D CAD modelling tool for engineering design.
In this market, VR developers have the choice of expensively maintaining the skills to support a range of tools to meet broad customer requirements, cheaply exploiting a single tool very well but thereby limiting their market, or developing their own custom toolkit layer that can be applied to a range of delivery vehicles, thereby meeting the needs of a wide customer base while also limiting the skills required by their developers (usually implementers) to a single toolkit. The additional cost of this third option is that it usually requires development overheads calling on systems-level skills that do not overlap with the VR designer's. An additional benefit of this approach in other technologies that have joined the systems development mainstream is that such custom toolkit layers usually define APIs that drive the development of market tools and form the basis of standards (Duce et al., 1997).
At the cusp of the second and third stages of technology development there is a need for VR to be amenable both to the representations used throughout a software lifecycle, in order to incorporate it into conventional development, and to the definition of general toolkits to meet market demands for implementations. But what is the novelty of VR technology that is being incorporated into conventional systems development?
In the purist demonstrations of VR, the virtual world models the real world through geometry, object appearance, behaviour and interaction, experienced through full (possibly multi-modal) immersion to achieve presence (Marsh & Wright, 2000). At the other extreme, real time interaction with 3D graphics on 2D screens may be all that is required to achieve sufficient engagement with the application to achieve the user's task. This scale mirrors that between real world widget metaphors in 2D user interfaces, and purely conventional devices such as pull down menus – even real world 2D metaphors show obvious limitations, such as the cultural specificity of mail box designs. In both 2D and VR interfaces, some metaphors promote transfer of learning or domain skill, resulting in faster assimilation of the computer application and sometimes better task performance. Where an application sits on this scale depends on the requirements of the users' task for the conventional usability considerations (Jacob Neilsen UE REF). Even when such requirements are clearly collected and represented, their mapping to a design solution in VR is still not fully understood; hence the continuing need for rapid prototyping of designs in development environments, to validate the design against the users' (perhaps unstated) requirements and to allow quick experimentation. We therefore need rapid prototyping tools which will support quick, cost effective prototype development and eventual successful application of 3D VR interaction techniques to a wide range of virtual environments.
This paper reports such a rapid prototyping tool as part of the development of a general systems development method for VR intended to incorporate VR technology into mainstream system development.
The INQUISITIVE VR development method
The INQUISITIVE development method provides representations for each of the stages in the conventional development cycle that capture the interaction required within VR.
The INQUISITIVE method does not address the geometry of objects in the virtual world, nor their appearance since these issues are addressed by graphical components in conventional development methods.
The INQUISITIVE method consists of five main components:
1) The interactive capture and representation of interaction requirements supported by a tool (implemented in Java for portability) producing a scene-graph like representation (Smith et al, 2000).
2) A Hybrid design representation which can be used to construct a design from the requirements that is amenable to the formal proof of interaction properties stated in the requirements (Smith et al, 1999).
3) An HCI analysis of the human cognitive and perceptual resources used in the interaction in VR which draws on the Hybrid design representation and provides guidance on design trade-offs (Marsh & Smith, 2000).
4) The mapping of the design to an implementation supported by a Java tool that translates the Hybrid design representation into code stubs for MAVERIK (Smith et al, 2000; Smith & Duke, 2000).
5) The Interaction Toolkit to provide a single implementation that can drive different VR development environments to meet the non-functional requirements as well as the functional and VR interaction ones.
Since the first four stages of the method, developed at the University of York, have been thoroughly reported elsewhere, the remainder of this paper considers the Interaction Toolkit: starting with the local requirements for it, then covering its architecture and the API it provides for the VR designer, before considering demonstrations of its application, and concluding with its evaluation and the further work to be done using it to provide guidance on the choice of interaction techniques for tasks.
The method includes representations that can be used as the basis of a model based approach to development, so that libraries can be used to decompose requirements, and guide design, but these models are not yet populated.
Proof of conformance of a design to general usability guidelines can be undertaken from the Hybrid representation and ISO HCI standards included in contracts.
Application and Toolkit Requirements
Our customers for VR applications are scientists and engineers (Sastry & Boyd, 1999) who have two main applications. Firstly, they wish to design and construct buildings and apparatus using 3D CAD engineering tools and undertake group design reviews using interactive real time navigation of them in VR. Secondly, they wish to visualise the data arising from scientific experiments and control that visualisation, maybe changing parameters of the visualisation process, or even steering the experiment generating the data in real time.
The users' objective is to achieve their task goals, and they are only open to using VR when it can be shown to help achieve that objective more effectively or efficiently than alternative means. They are not interested in subsidising the use of VR either for publicity purposes or to further the development of VR technology itself. One of the advantages of VR to the users is that it allows them to do unreal things that they could not do in the real world, such as measuring between unreachable locations, or moving immovable objects. However, even in these unreal cases, the interaction in immersive or semi-immersive 3D may facilitate a speed of navigation, a precision of interaction, or a perception resulting in insight, that are unattainable otherwise.
These users are experts in the real world domain tasks, and familiar with the real world objects and actions in their domains, but they are also highly computer literate in their own specialist tools, and accustomed to many computer domain user interface conventions. For the visualisation task, the data and its relationships have no real world representation to imitate in a virtual reality, therefore most aspects of the visualisation scene are either domain conventions or even just conventions of a previous computer representation. Consequently, the users are open to the full range of realism, and VR interaction from full presence to 2D interaction with 3D graphics.
The style of use of different user groups varies considerably; some users wish to distribute VR applications to large communities to be used around the world, so they wish to use public domain code. Others require fully certified and supported development and interaction environments that will conform to the quality control constraints on general contracts for the development of multi-million Euro systems (e.g. satellites). Therefore no single VR interaction environment will meet these requirements, so we currently use both MAVERIK from Manchester University, UK as a public domain tool and Parametric Technology's dvMockup VR kernels.
Local variations also exist in the requirements that cannot easily be met by both of these environments, requiring the development of further interaction components. For example, in one application where a public domain interaction tool is required, users wish to use an eye-level viewpoint for gross navigation around buildings and equipment, but wish to place the viewpoint at a fixed location for detailed study of experimental behaviour. Such facilities are provided by the large development environment, but not by the public domain interaction tool, so a common level of implementation of these is required to meet the user requirements, so as not to lock designs into the capabilities of a single development environment.
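The dual viewpoint requirement can be sketched as follows. This is an illustrative sketch only, assuming a simple mode switch between eye-level navigation and a pinned viewpoint; the class and method names are hypothetical and are not taken from either VR environment.

```python
# Hypothetical sketch of the dual viewpoint requirement: an eye-level mode
# for gross navigation and a fixed mode for detailed study.
EYE_LEVEL = 1.7  # assumed eye height in metres

class ViewpointController:
    def __init__(self):
        self.mode = "eye_level"
        self.position = (0.0, EYE_LEVEL, 0.0)

    def walk_to(self, x, z):
        """Gross navigation: the viewpoint follows the user at eye level."""
        if self.mode == "eye_level":
            self.position = (x, EYE_LEVEL, z)

    def fix_at(self, x, y, z):
        """Detailed study: pin the viewpoint at a chosen location."""
        self.mode = "fixed"
        self.position = (x, y, z)

vc = ViewpointController()
vc.walk_to(3.0, -2.0)      # position follows the user at eye height
vc.fix_at(1.0, 0.5, 0.0)   # subsequent walk_to calls no longer move it
vc.walk_to(9.0, 9.0)
```

Implementing this mode switch once, above the kernel interface, is what gives the two underlying environments a common level of viewpoint behaviour.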
INQUISITIVE Interaction Toolkit
The interaction toolkit will improve support for developing user interaction within task-oriented virtual environment applications. Analysis of the interaction techniques has led us to a modular design for the toolkit, with defined interfaces to input devices and to existing commercial and public domain VR system kernels. The interaction toolkit is being developed to provide application developers and human factors researchers with a portable toolkit of interaction techniques for navigation, selection and manipulation within virtual environments.
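The defined kernel interface might be sketched as below. This is a hedged illustration, not the toolkit's actual API: the interface methods, class names and logging behaviour are all assumptions, chosen only to show how one body of interaction code can drive different kernels (such as MAVERIK or dvMockup) through per-kernel adapters.

```python
# Hypothetical sketch of a kernel-interface layer: interaction code calls
# the abstract interface; each adapter maps the calls to one VR kernel.
from abc import ABC, abstractmethod

class VRKernel(ABC):
    """Minimal interface the toolkit expects from an underlying VR kernel."""

    @abstractmethod
    def set_viewpoint(self, position, orientation): ...

    @abstractmethod
    def pick_object(self, ray_origin, ray_direction): ...

class MaverikKernel(VRKernel):
    # A real adapter would translate into the kernel's native API;
    # here the bodies just record the calls for illustration.
    def __init__(self):
        self.log = []

    def set_viewpoint(self, position, orientation):
        self.log.append(("viewpoint", position, orientation))

    def pick_object(self, ray_origin, ray_direction):
        self.log.append(("pick", ray_origin, ray_direction))
        return None

kernel = MaverikKernel()
kernel.set_viewpoint((0.0, 1.7, 0.0), (0.0, 0.0, 0.0))
```

Interaction techniques written against `VRKernel` then run unchanged over either delivery vehicle, which is the portability property the toolkit is after.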
In virtual environments, all application tasks, however complex, can be implemented as combinations of tasks from four basic classes of user interaction: navigation, selection, manipulation and data input. In 3D, positioning is extended to include defining orientation in 3D space.
Each basic interaction task can be realised using a number of possible interaction techniques.
For example movement can be implemented using the magic carpet or point-fly techniques.
Each application will identify one or more interaction techniques appropriate for carrying out the tasks required in that application. This in turn will guide the definition of the interaction processes needed to realise those techniques. The application tasks can then be achieved by suitable combinations of these interaction processes.
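The relationship between task classes and techniques can be sketched as a simple lookup, with the application's choices validated against it. The technique names beyond magic carpet and point-fly are illustrative assumptions, not an inventory of the toolkit's actual techniques.

```python
# Hypothetical mapping from the four basic task classes to candidate
# interaction techniques; an application selects one technique per task.
TECHNIQUES = {
    "navigation":   ["magic_carpet", "point_fly"],
    "selection":    ["ray_cast", "touch"],
    "manipulation": ["direct_grab", "handle_widgets"],
    "data_input":   ["virtual_keypad", "speech"],
}

def choose_technique(task, preferred):
    """Return the preferred technique for a task, checking it realises the task."""
    options = TECHNIQUES[task]
    if preferred not in options:
        raise ValueError(f"{preferred!r} does not realise {task!r}; options: {options}")
    return preferred

# A design-review application might select point-fly for its navigation task:
choice = choose_technique("navigation", "point_fly")
```

The chosen techniques then determine which interaction processes the toolkit must supply to realise them.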
Based on the above analysis, the main functional components which the toolkit must provide are:
• a set of interaction processes for the four classes of basic interaction tasks;
• a set of generic virtual interaction objects, such as a toolbox.
We are using several components of the UML notation to help us transform our high level specification of the toolkit into a detailed and implementable design (Boyd & Sastry, 1999).
Figure 1 depicts the detailed architecture of the interaction toolkit and how it maps onto the two VR kernels we are using. It also shows several components of the UML-based analysis for interaction techniques.
Table 1 shows the classes of application objects in the interaction toolkit. All classes of interaction object in the toolkit allow VR application designers to change the appearance of widgets.
When instantiated as application objects, the interaction objects can be, for example, distance measuring tools, or meters to read values of temperature, radiation and other quantities from the database underlying the CAD model. In the latter case the display of a meter can be on a window pane or on a hand-held display moving with the user, depending on which interaction object is chosen. This flexibility in instantiating application objects in different ways shows the power of the interaction toolkit both to meet interaction requirements and to allow their investigation through rapid prototyping when required.
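The instantiation flexibility described above can be sketched as follows. All of the names here (the `Meter` class, its `display` parameter, the dictionary standing in for the model database) are hypothetical illustrations, not the toolkit's actual classes.

```python
# Hypothetical sketch: the same generic "meter" interaction object is
# instantiated twice with different display surfaces, reading values
# from the database underlying the CAD model.
class Meter:
    def __init__(self, quantity, display):
        self.quantity = quantity   # e.g. "temperature" or "radiation"
        self.display = display     # "window_pane" or "hand_held"
        self.value = None

    def read(self, model_database):
        """Read this meter's quantity from the underlying model database."""
        self.value = model_database[self.quantity]
        return self.value

# A dictionary stands in for the database behind the CAD model.
db = {"temperature": 21.5, "radiation": 0.02}
wall_meter = Meter("temperature", display="window_pane")
hand_meter = Meter("radiation", display="hand_held")
wall_meter.read(db)
hand_meter.read(db)
```

Swapping the `display` argument is all it takes to prototype the alternative presentations against the same data, which is the rapid-prototyping property the toolkit aims for.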
Toolkit Demonstrations
A sample set of interaction techniques will be presented (demonstrated) which include application/user centred navigational and/or object manipulation, real-time interactive editing, querying and steering of the virtual world.
The testbed demonstrator applications include an engineering design review with an architectural walk-through and a visualisation and three-dimensional browsing of Cluster-II satellite data implemented using the interaction toolkit.
Design Review Demonstration
Engineers need to construct a new building, a particle beam target, and experiments on beam lines off the target. Each component is being developed by a different team, with teams spread throughout Europe.