Currently, active research and exploration of space is underway, including aboard the International Space Station (ISS). The work of cosmonauts at the station involves performing tasks not only inside the modules but also on their surface (extravehicular activity), which requires spacewalks [1, 2]. In this case, the cosmonaut attaches himself to the spacecraft by means of two safety tethers whose carabiners clip onto the station's handrails. Some jobs involve a long traverse from the exit of the space module to the destination point. While moving along the handrails, the cosmonaut must fix and unfix the carabiners of the two tethers multiple times [3]. There is therefore a risk of unsuccessful carabiner fixation or tether malfunction, in which case the cosmonaut is detached from the station and cannot return. To solve this problem, it is important to use a cosmonaut rescue system (for example, the Simplified Aid For EVA Rescue, abbreviated SAFER [4]), which is a space jetpack with thrusters worn over the spacesuit. Manual or automatic control of the jetpack is performed via a special control panel. Training cosmonauts in terrestrial conditions in the skills of using such rescue equipment is an important and relevant task.
In recent years, an actively developing area is the creation of training systems based on virtual reality technologies. These systems replace real objects with their virtual prototypes, and training is performed by immersing the operator in a virtual environment. Examples are the simulators [5, 6] designed to train cosmonauts to control spacecraft and to perform tasks during extravehicular activity. Importantly, VR technologies can improve the quality of visual perception for those tasks and operations that cannot be realized in terrestrial conditions. Paper [7] presents solutions for training cosmonauts in spacewalking that include several tracking systems for determining the movements of various parts of the human body (arms, legs, head). The authors of paper [8] developed a simulator for controlling a manned spacecraft with joysticks as part of the planned Moon exploration mission. Hardware solutions [9] created in the NASA laboratory and based on cosmonaut training by means of VR technologies are also widely used.
Several papers are devoted to the task of cosmonaut rescue using a jetpack [10, 11, 12]. In paper [10], the MATLAB package with a control implementation in the Simulink environment is used to simulate a space jetpack. This approach is intended to demonstrate cosmonaut motion, but it is not suitable for training. An alternative solution consists in using external control devices, as shown in [11] and [12] by the example of a test bench for space jetpack control with a joystick or gamepad. In the publications considered, simulation of cosmonaut motion was carried out without VR technologies.
This paper presents methods and approaches for training cosmonauts to control a jetpack, based on VR technologies and a developed virtual environment system. The scientific novelty is that the jetpack model is controlled through the interaction of a real cosmonaut with the elements of a virtual three-dimensional control panel. To that end, a virtual model of a cosmonaut with a jetpack was created, in which the movement of the model's hands copies the movement of the operator's real hands. With this approach, control of the virtual jetpack, and hence the motion of the cosmonaut model, is carried out by the interaction of a real person's hands with the elements of the jetpack's three-dimensional virtual control panel through virtual hands. The proposed solution includes implementing the dynamics and control of the virtual cosmonaut model and its hands, as well as the operation of the control panel and thrusters of the space jetpack. The results obtained in this paper were tested in our virtual environment system VirSim [13], developed at SRISA RAS, and demonstrated their effectiveness by the example of solving the cosmonaut rescue problem.
The virtual environment system (VES) that we created for training cosmonauts to use rescue equipment in the form of a jetpack includes a hardware block, a software complex and digital visual models (DVM). Its structure is illustrated in Figure 1.
Fig. 1. VES
structure for cosmonaut rescue training.
The hardware block, which consists of Manus Prime II VR gloves, Oculus Touch controllers and an Oculus Rift CV1 VR headset, has two functions. The first is tracking the operator's head and hands. Despite the good quality of finger tracking, Manus gloves do not determine the position of the hands in space, only their orientation. In this regard, we propose to implement the missing functionality using the Oculus Touch controllers. The original and quite ergonomic placement of the Touch on a hand wearing a Manus glove is shown in Figure 2. The second function of the hardware block is to display the synthesized stereopair to the operator's eyes. The Rift VR headset provides this capability.
Fig. 2. Device
configuration of Oculus Touch and Manus Prime II on operator’s hand.
The software complex is our own original product and does not use third-party solutions. It includes three subsystems whose tasks are virtual object control, simulation of object dynamics, and virtual environment visualization. The subsystems run as separate processes managed by a common shell. The shell also transfers most of the data between the subsystems using special protocols. The control subsystem receives at its input information from the external devices included in the hardware block. In addition, it receives data from the state sensors of virtual control elements (buttons, toggle switches, joysticks, etc.) which are parts of the DVM. Based on the obtained information, the functional schemes [14] for controlling the dynamic objects of the virtual environment are computed and control signals are synthesized. Functional schemes are stored together with the DVM and loaded into the control subsystem during VES initialization. The generated signals are transmitted to the dynamics subsystem, which computes new positions and orientations of virtual environment objects and also performs collision detection and response between them. After that, a stereo image of the virtual space must be synthesized and transferred to the hardware block for display. This functionality is implemented in the visualization subsystem. Stereopair rendering, with simulation of realistic object lighting and shadowing in the virtual environment, is performed on a multi-core graphics processing unit (GPU) in real time, i.e. at a rate of at least 25 frames per second. The resulting stereopair is transmitted to the Rift VR headset.
Fig. 3.
Interaction of Oculus devices with subsystems of software complex.
One of the challenges encountered when implementing the interconnection between the hardware block and the software complex is that the channel for receiving/transmitting information with Oculus devices can be initialized only within one process. So there is no way to simultaneously access the devices both in the control subsystem (for reading tracking data from the Touch and the Rift) and in the visualization subsystem (for sending the stereopair to the Rift). To solve the problem, we propose an approach that implements a direct information link between the Oculus VR equipment and the visualization subsystem, as well as a transit channel, over the UDP network protocol, for the data the control subsystem requires from the Oculus devices, routed through the visualization subsystem to the functional scheme of cosmonaut model control (fig. 3). To do this, a client module is implemented in the visualization subsystem. It interacts with the VR devices via the Oculus SDK and sends network packets to the control subsystem. Furthermore, a special Oculus Control block is implemented in the functional scheme editor's library; it is a part of the scheme loaded into the control subsystem and provides the server function for receiving data from the client module.
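The sketch below illustrates such a transit channel. It is a minimal example under assumptions: the packet layout, port handling and function names are our own illustration (the actual VirSim protocol is not published), and POSIX sockets are used for brevity.

```cpp
// Hypothetical sketch of the transit channel between the visualization
// subsystem (UDP client) and the Oculus Control block (UDP server).
// The packet layout below is an assumption, not the VirSim protocol.
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>

// Tracking sample forwarded from the Oculus SDK to the control subsystem.
struct TrackingPacket {
    float headPos[3], headQuat[4];        // Rift headset pose
    float handPos[2][3], handQuat[2][4];  // left/right Touch poses
};

// Client side (visualization subsystem): send one sample per frame.
void sendTracking(int sock, const sockaddr_in& dst, const TrackingPacket& p) {
    sendto(sock, &p, sizeof(p), 0,
           reinterpret_cast<const sockaddr*>(&dst), sizeof(dst));
}

// Server side (Oculus Control block): non-blocking read; returns true
// if a fresh tracking packet has arrived on the bound socket.
bool receiveTracking(int sock, TrackingPacket& p) {
    return recv(sock, &p, sizeof(p), MSG_DONTWAIT)
           == static_cast<ssize_t>(sizeof(p));
}
```

Because UDP is connectionless, the control subsystem simply consumes the most recent sample each cycle, which suits per-frame tracking data where stale packets can be dropped.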
The DVM created by us includes a highly detailed three-dimensional virtual scene of the cosmonaut's environment (models of the ISS and the Earth), which has about a million polygons, and a virtual model of the cosmonaut with a jetpack. A control scheme for the motion of the cosmonaut model (head rotations, translations and rotations of the hands, bending of the fingers) and the operation of the rescue equipment's thrusters was also developed. Signals from the hardware block, as well as from the toggle switches and joystick located on the virtual control panel of the jetpack, are the inputs of this scheme. To receive the signals from Oculus devices, the Oculus Control block considered above is added to the scheme.
The interaction of the trained cosmonaut with virtual environment objects consists in copying the movements of his hands with their virtual prototypes. Based on VR technologies, this is done through hand tracking with VR gloves and controllers. To implement this approach in the developed VES, virtual models of the cosmonaut's hands were created and designed in the 3ds Max computer simulation system. Figure 4 shows the model of the right hand, which consists of a set of bones and geometry. Changing the positions and orientations of the bones leads to compression and stretching of the upper fabric layers. This behavior is specified using a special skin modifier by applying the coefficients of bone influence to the vertices of the model geometry (see the sketch after Figure 4).
Fig. 4. Virtual
model of cosmonaut’s hand.
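The deformation a skin modifier produces is, in essence, linear blend skinning: each vertex follows a weighted sum of bone transforms. The following sketch is our illustration of that standard technique, not the VirSim code; the bone transforms are assumed to be given relative to the bind pose.

```cpp
// Illustrative linear blend skinning: each vertex is deformed by a
// weighted combination of bone transforms (the "influence coefficients").
#include <vector>

struct Vec3 { float x = 0, y = 0, z = 0; };

// Rigid bone transform relative to the bind pose:
// rotation (row-major 3x3) plus translation.
struct BoneTransform {
    float r[3][3]; Vec3 t;
    Vec3 apply(const Vec3& p) const {
        return { r[0][0]*p.x + r[0][1]*p.y + r[0][2]*p.z + t.x,
                 r[1][0]*p.x + r[1][1]*p.y + r[1][2]*p.z + t.y,
                 r[2][0]*p.x + r[2][1]*p.y + r[2][2]*p.z + t.z };
    }
};

struct VertexWeight { int bone; float w; };  // bone influence coefficient

// Deform one rest-pose vertex; the weights are assumed to sum to 1.
Vec3 skinVertex(const Vec3& rest,
                const std::vector<VertexWeight>& weights,
                const std::vector<BoneTransform>& bones) {
    Vec3 out;
    for (const VertexWeight& vw : weights) {
        Vec3 p = bones[vw.bone].apply(rest);
        out.x += vw.w * p.x; out.y += vw.w * p.y; out.z += vw.w * p.z;
    }
    return out;
}
```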
Hand motion simulation is implemented in the developed VES using a dynamic model of the hand. In this model, the hand is represented as an articulated body system. To describe the motion of the virtual hand, consider the CCS (Cosmonaut Coordinate System) of the cosmonaut model's torso, the HCS (Hand Coordinate System) of the hand itself, and the FCS (Finger Coordinate Systems) of its fingers. The hand position is defined relative to the torso by the radius-vector $\mathbf{r}$, and the attitude by rotations first around the $x$-axis by the angle $\alpha$, then around the $y$-axis by the angle $\beta$, and finally around the $z$-axis by the angle $\gamma$. In the proposed model, the motion of the fingers is described by the bend angles $\theta_k$, $k = 1, \ldots, K$, where $K$ is the number of finger links of the hand. Combining the vector $\mathbf{r}$ and the angles $\alpha$, $\beta$, $\gamma$, $\theta_k$, we obtain that the hand motion is described by the generalized coordinates $\mathbf{q} = (\mathbf{r}, \alpha, \beta, \gamma, \theta_1, \ldots, \theta_K)^T$, $\mathbf{q} \in \mathbb{R}^{6+K}$.
The task of virtual hand simulation is to ensure its motion, including in the presence of collisions with virtual environment objects. This requires simulating contact, impact and friction between bodies, which are formulated in the form of constraints imposed on the coordinates and velocities of the bodies. The task described above is then to find $\mathbf{q}(t)$ for every time instant $t$ under such constraints.
The articulated body dynamics with generalized coordinates is described by the differential equations [15]:

$$H(\mathbf{q})\,\ddot{\mathbf{q}} + C(\mathbf{q}, \dot{\mathbf{q}})\,\dot{\mathbf{q}} = \mathbf{Q} + J^T(\mathbf{q})\,\boldsymbol{\lambda}, \qquad (1)$$

where $H(\mathbf{q})$ is the mass matrix, $C(\mathbf{q}, \dot{\mathbf{q}})$ is the Coriolis matrix, $\mathbf{Q}$ is the vector of generalized forces (external torques applied at the hand's joints), $J(\mathbf{q})$ is the constraint Jacobian matrix, $\boldsymbol{\lambda} \in \mathbb{R}^M$ is the vector of Lagrange multipliers, and $M$ is the number of constraints.
Differential equations (1) describe the dynamics of the virtual hand with constraints, where $\ddot{\mathbf{q}}$ and $\boldsymbol{\lambda}$ are the unknown variables. To solve this task, an approach is proposed that is based on the articulated-body method [16] and its extended version [17] for constraint enforcement. The forward dynamics problem for solving equations (1) in the articulated-body method is reduced to the following form:

$$\ddot{\mathbf{q}} = \ddot{\mathbf{q}}_0 + \Phi(\mathbf{q})\,\boldsymbol{\lambda}, \qquad \ddot{\mathbf{q}}_0 = \boldsymbol{\varphi}(\mathbf{q}, \dot{\mathbf{q}}, \mathbf{Q}), \qquad (2)$$

where the functions $\boldsymbol{\varphi}$ and $\Phi$ are computed recursively.
We use the semi-implicit Euler scheme for the numerical integration of equations (2). According to this scheme, first, the generalized velocities with time step $h$ for the time instant $t + h$ are computed:

$$\dot{\mathbf{q}}^{\,t+h} = \dot{\mathbf{q}}^{\,t} + h\,\ddot{\mathbf{q}}_0. \qquad (3)$$
The search for the vector $\boldsymbol{\lambda}$ is formulated as a Mixed Linear Complementarity Problem (MLCP) [18]. This task is, for a given matrix $A$ and vector $\mathbf{b}$, to find $\boldsymbol{\lambda}$ and $\mathbf{w}$ such that

$$\mathbf{w} = A\boldsymbol{\lambda} + \mathbf{b}, \qquad \boldsymbol{\lambda}^- \le \boldsymbol{\lambda} \le \boldsymbol{\lambda}^+, \qquad (4)$$

where for each component one of three conditions is satisfied: $\lambda_i = \lambda_i^-$ and $w_i \ge 0$; $\lambda_i = \lambda_i^+$ and $w_i \le 0$; or $\lambda_i^- < \lambda_i < \lambda_i^+$ and $w_i = 0$.
In task (4), the vector elements $b_i$, $i = 1, \ldots, M$, are computed by differentiating the constraints on body velocities, and the matrix elements $A_{ij}$, $i, j = 1, \ldots, M$, are determined on the basis of the extended articulated-body method [17, 19]. The vector elements $\lambda_i^-$ and $\lambda_i^+$ are set depending on the type of constraint (for example, $\lambda_i^- = 0$ and $\lambda_i^+ = +\infty$ for a contact constraint). In this work, to solve task (4), the Projected Gauss-Seidel method is used [20]. The idea is that the $\lambda_i$ are computed iteratively by solving the equations $A\boldsymbol{\lambda} = -\mathbf{b}$. If $\lambda_i$ does not satisfy (4), a projection onto the feasible set $[\lambda_i^-, \lambda_i^+]$ is performed. After computing $\boldsymbol{\lambda}$, the generalized velocities are corrected as follows:

$$\dot{\mathbf{q}}^{\,t+h} \leftarrow \dot{\mathbf{q}}^{\,t+h} + h\,\Phi(\mathbf{q})\,\boldsymbol{\lambda}. \qquad (5)$$
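A compact sketch of the Projected Gauss-Seidel iteration for task (4) is given below. Container types and names are our own choice; the matrix $A$ is assumed to have a nonzero diagonal, as is standard for this method.

```cpp
// Projected Gauss-Seidel for the MLCP (4): sweep the Gauss-Seidel update
// for A*lambda = -b and clamp each lambda_i to its feasible interval.
#include <algorithm>
#include <vector>

std::vector<double> projectedGaussSeidel(
    const std::vector<std::vector<double>>& A,
    const std::vector<double>& b,
    const std::vector<double>& lo,   // lambda^- bounds
    const std::vector<double>& hi,   // lambda^+ bounds
    int iterations = 20)
{
    const std::size_t M = b.size();
    std::vector<double> lambda(M, 0.0);
    for (int it = 0; it < iterations; ++it) {
        for (std::size_t i = 0; i < M; ++i) {
            // Gauss-Seidel step for row i of A*lambda + b = 0.
            double s = b[i];
            for (std::size_t j = 0; j < M; ++j)
                if (j != i) s += A[i][j] * lambda[j];
            double x = -s / A[i][i];
            // Projection onto the feasible set of task (4).
            lambda[i] = std::clamp(x, lo[i], hi[i]);
        }
    }
    return lambda;
}
```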
Finally, at the last stage of the semi-implicit Euler scheme, the new generalized coordinates for the time instant $t + h$ are computed:

$$\mathbf{q}^{\,t+h} = \mathbf{q}^{\,t} + h\,\dot{\mathbf{q}}^{\,t+h}. \qquad (6)$$
Virtual hand control is realized in copying mode by means of the Manus VR glove and Oculus Touch controller. With this approach, the readings of the devices set the vector $\mathbf{q}^*$ of desired generalized coordinates. The proposed solution is that the torques $\mathbf{Q}$ are formed in the following form:

$$\mathbf{Q} = K_p\,(\mathbf{q}^* - \mathbf{q}) - K_d\,\dot{\mathbf{q}}, \qquad (7)$$

where $K_p$ and $K_d$ are given coefficients.
Thus, the task of virtual hand motion simulation is implemented by applying torques according to Eq. (7) and integrating equations (2) with the semi-implicit Euler scheme using expressions (3), (5) and (6), where the vector $\boldsymbol{\lambda}$ is computed by solving (4).
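Putting the pieces together, one step of the hand simulation can be sketched as follows. This is an illustration under assumptions: freeAcceleration and constraintImpulse are hypothetical stubs standing in for the recursive articulated-body pass of Eq. (2) and the MLCP/PGS solution of task (4).

```cpp
// One simulation step of the virtual hand: PD copying torques (7),
// then the semi-implicit Euler scheme (3), (5), (6).
#include <cstddef>
#include <vector>

using Vec = std::vector<double>;

// Stub for qdd0 = phi(q, qd, Q) of Eq. (2); a unit mass matrix with no
// Coriolis terms is assumed here purely for illustration.
static Vec freeAcceleration(const Vec&, const Vec&, const Vec& Q) { return Q; }

// Stub for the correction h * Phi * lambda of Eq. (5); returns zero,
// i.e. no active constraints. The real version solves task (4).
static Vec constraintImpulse(const Vec& q, const Vec&, double) {
    return Vec(q.size(), 0.0);
}

void simulateHandStep(Vec& q, Vec& qd, const Vec& qStar,
                      double kp, double kd, double h)
{
    const std::size_t n = q.size();
    Vec Q(n);
    for (std::size_t i = 0; i < n; ++i)      // (7): PD copying torques
        Q[i] = kp * (qStar[i] - q[i]) - kd * qd[i];

    Vec qdd0 = freeAcceleration(q, qd, Q);   // recursive pass of Eq. (2)
    for (std::size_t i = 0; i < n; ++i)      // (3): unconstrained velocities
        qd[i] += h * qdd0[i];

    Vec dqd = constraintImpulse(q, qd, h);   // (4), (5): constraint correction
    for (std::size_t i = 0; i < n; ++i)
        qd[i] += dqd[i];

    for (std::size_t i = 0; i < n; ++i)      // (6): position update
        q[i] += h * qd[i];
}
```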
The motion of the cosmonaut in Earth orbit is described relative to the ISS. To do this, consider the world coordinate system $Oxyz$ (WCS) and the local coordinate systems (LCS) rigidly attached to the ISS and to the cosmonaut, respectively (fig. 5). The cosmonaut position is defined by the radius-vector $\mathbf{p}$, and the attitude by the transformation matrix $R$ from the LCS of the cosmonaut to the WCS.
Fig. 5. Virtual
model of cosmonaut with jetpack.
The translational (linear) motion of the cosmonaut is produced by the thrusters, whose directions coincide with the axes of the LCS. Since there are four thrusters with equal thrust $f$ on each side of the jetpack, the total thrust from all the thrusters forms a vector $\mathbf{F} = (F_x, F_y, F_z)^T$ relative to the LCS, where $F_i = n_i f$ and $n_i \in \{0, \pm 1, \ldots, \pm 4\}$ is the number of thrusters switched on along the corresponding axis.
In the absence of gravity and without taking into account the influence of the ISS on the cosmonaut's motion, the dynamics of linear motion is described by differential equations in the form of Newton's second law:

$$m\,\dot{\mathbf{v}} = R\,\mathbf{F}, \qquad (8)$$

where $m$ is the total mass of the cosmonaut, spacesuit and jetpack, and $\mathbf{v}$ is the linear velocity of the cosmonaut in the WCS.
The rotational motion of the cosmonaut is produced by torques created by the jetpack's thrusters. It is assumed that the thrusters are located symmetrically relative to the cosmonaut's center of mass. We therefore obtain the total moment $\mathbf{T} = (T_x, T_y, T_z)^T$ relative to the LCS, where $T_x$, $T_y$ and $T_z$ are the torques of the jetpack's thrusters about the corresponding axes.
The dynamics of the rotational motion of the cosmonaut is described by the Euler differential equations [21]:

$$I_x\,\dot{\omega}_x + (I_z - I_y)\,\omega_y\omega_z = T_x, \qquad I_y\,\dot{\omega}_y + (I_x - I_z)\,\omega_z\omega_x = T_y, \qquad I_z\,\dot{\omega}_z + (I_y - I_x)\,\omega_x\omega_y = T_z, \qquad (9)$$

where $I_x$, $I_y$ and $I_z$ are the principal moments of inertia, and $\boldsymbol{\omega} = (\omega_x, \omega_y, \omega_z)^T$ is the angular velocity of the cosmonaut in its LCS.
In this way, the mathematical model for the motion of the cosmonaut with a jetpack is described by equations (8) and (9), in which $F_x$, $F_y$, $F_z$, $T_x$, $T_y$ and $T_z$ are the control variables.
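A semi-implicit Euler integration of this model can be sketched as below. The state layout and names are our assumptions; attitude is propagated through $\dot{R} = R[\boldsymbol{\omega}]_\times$, with re-orthonormalization of $R$ left out for brevity.

```cpp
// Integration of the cosmonaut motion model, Eqs. (8) and (9).
#include <array>

using Vec3 = std::array<double, 3>;
using Mat3 = std::array<std::array<double, 3>, 3>;

struct CosmonautState {
    Vec3 p{}, v{};   // position and linear velocity in the WCS
    Mat3 R{};        // attitude: LCS -> WCS
    Vec3 w{};        // angular velocity in the LCS
};

void integrate(CosmonautState& s, const Vec3& F, const Vec3& T,
               double m, const Vec3& I, double h)
{
    // (8): m * dv/dt = R * F, with the thrust F given in the LCS.
    for (int i = 0; i < 3; ++i) {
        double Fw = s.R[i][0]*F[0] + s.R[i][1]*F[1] + s.R[i][2]*F[2];
        s.v[i] += h * Fw / m;
        s.p[i] += h * s.v[i];
    }
    // (9): Euler equations with principal moments of inertia I.
    Vec3 dw = {
        (T[0] - (I[2] - I[1]) * s.w[1] * s.w[2]) / I[0],
        (T[1] - (I[0] - I[2]) * s.w[2] * s.w[0]) / I[1],
        (T[2] - (I[1] - I[0]) * s.w[0] * s.w[1]) / I[2]
    };
    for (int i = 0; i < 3; ++i) s.w[i] += h * dw[i];
    // dR/dt = R * [w]_x: propagate attitude with the updated velocity
    // (R should be re-orthonormalized periodically in practice).
    Mat3 Rn = s.R;
    for (int i = 0; i < 3; ++i) {
        Rn[i][0] += h * (s.R[i][1]*s.w[2] - s.R[i][2]*s.w[1]);
        Rn[i][1] += h * (s.R[i][2]*s.w[0] - s.R[i][0]*s.w[2]);
        Rn[i][2] += h * (s.R[i][0]*s.w[1] - s.R[i][1]*s.w[0]);
    }
    s.R = Rn;
}
```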
In this paper, an approach is proposed in which automatic and manual motion control of the cosmonaut model is implemented using a virtual three-dimensional control panel, the prototype of which is the hand controller module of the SAFER rescue system [4]. Figure 6 shows the created model of this control panel, which contains three toggle switches, one button and a four-axis joystick with a button. The two-position toggle switch labeled "PWR" turns the control panel on and off, while the two-position toggle switch labeled "MODE" is used to select the mode of rotational or translational motion of the cosmonaut model ("ROT" and "TRAN"). The joystick has four degrees of freedom: displacement along the X-axis and rotations around the Z, Y and X axes (in fig. 6, rotation around the X-axis is denoted as Rot). The button on the joystick of the control panel turns on the automatic attitude hold mode, in which the angular velocity of the cosmonaut is damped. The created control panel also includes the "RTRN" button to activate the mode of automatic return of the cosmonaut to a predetermined position.
Fig. 6. Virtual model of space jetpack's control panel.
Table 1. Control parameters of the space jetpack in manual mode.

| Joystick deflection | MODE = "ROT" | MODE = "TRAN" |
| --- | --- | --- |
| X- | torque $-T_y$ | thrust $-F_x$ |
| X+ | torque $+T_y$ | thrust $+F_x$ |
| X0 | $T_y = 0$ | $F_x = 0$ |
| Z- | torque $-T_z$ | thrust $-F_z$ |
| Z+ | torque $+T_z$ | thrust $+F_z$ |
| Z0 | $T_z = 0$ | $F_z = 0$ |
Depending on the motion mode, changing the state of the joystick in manual control sets the commands to turn the thrusters on and off. The joystick offset along the Y-axis and its rotation around the X-axis are available in both modes. When the joystick is shifted along the Y-axis, the thrust $F_y$ is generated, and when the joystick is twisted, the torque $T_x$ is generated. The control law of the jetpack for joystick deflections in the other directions is given in Table 1, where the signs "+" and "-" indicate positive and negative directions, respectively, and "0" is the neutral position.
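A sketch of this manual-mode mapping is given below. It follows Table 1 as reconstructed above, so the axis-to-component assignment is our assumption; deflection values are taken as -1, 0 or +1.

```cpp
// Hedged sketch: map joystick state to jetpack commands in manual mode.
enum class Mode { ROT, TRAN };

struct JoystickState { int x, z; double yOffset, rot; };
struct JetpackCommand { double F[3]{}, T[3]{}; };  // LCS thrust / torque

JetpackCommand mapJoystick(const JoystickState& js, Mode mode,
                           double fMax, double tMax)
{
    JetpackCommand c;
    // Available in both modes: Y offset -> thrust F_y, twist -> torque T_x.
    c.F[1] = fMax * js.yOffset;
    c.T[0] = tMax * js.rot;
    if (mode == Mode::ROT) {           // deflections command torques
        c.T[1] = tMax * js.x;
        c.T[2] = tMax * js.z;
    } else {                           // deflections command thrusts
        c.F[0] = fMax * js.x;
        c.F[2] = fMax * js.z;
    }
    return c;
}
```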
The attitude stabilization of the cosmonaut model (damping of the angular velocity) is provided in automatic mode by pressing the joystick's button. Let us use nonlinear control theory to solve this task. Consider the positive definite Lyapunov function $V = \omega_x^2/2$. From the theory of Lyapunov functions [22] it follows that for a sliding mode to occur around the surface $\omega_x = 0$, it is necessary and sufficient that the inequality

$$\dot{V} = \omega_x\,\dot{\omega}_x < 0 \qquad (10)$$

is satisfied. Substituting equations (9) into (10) yields

$$\frac{\omega_x}{I_x}\left(T_x - (I_z - I_y)\,\omega_y\omega_z\right) < 0.$$
Let us form the relay control in the following form:

$$T_x = -T_x^{\max}\operatorname{sign}(\omega_x), \qquad (11)$$

where $T_x^{\max}$ is the torque magnitude available from the thrusters about the $x$-axis. Then inequality (10) will be satisfied under the condition

$$T_x^{\max} > \left|(I_z - I_y)\,\omega_y\omega_z\right|. \qquad (12)$$
To reduce the number of switchings, it is proposed to use a dead zone with a given threshold $\Delta$. Then the control law for attitude stabilization around the $x$-axis of the LCS is

$$T_x = \begin{cases} -T_x^{\max}\operatorname{sign}(\omega_x), & |\omega_x| > \Delta, \\ 0, & |\omega_x| \le \Delta. \end{cases} \qquad (13)$$
Expressions for the torques $T_y$ and $T_z$ of attitude stabilization around the $y$ and $z$ axes of the LCS are formed in a similar way.
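The per-axis relay law (13) reduces to a few lines of code; the sketch below uses assumed names (tMax for the available thruster torque, delta for the dead-zone threshold).

```cpp
// Relay attitude-damping law (13) with a dead zone, per LCS axis.
#include <cmath>

double stabilizationTorque(double omega, double tMax, double delta) {
    if (std::fabs(omega) <= delta) return 0.0;  // inside dead zone: coast
    return omega > 0.0 ? -tMax : tMax;          // -tMax * sign(omega)
}

// Usage: T_x = stabilizationTorque(w_x, tMax, delta); same for y and z.
```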
In turn, the proposed solution for the automatic return of the cosmonaut model to the target position is to ensure a motion velocity directed towards the target:

$$\mathbf{v}^{des} = v^*\,\frac{\mathbf{d}}{l},$$

where $v^*$ is a constant velocity magnitude for the cosmonaut model, $\mathbf{d}$ is the residual vector from the cosmonaut to the target, and $l = \|\mathbf{d}\|$ is the distance to it.
Using the theory of Lyapunov functions and equations (8), the control law for the motion of the cosmonaut model along the $x$-axis of its LCS is

$$F_x = \begin{cases} -F_x^{\max}\operatorname{sign}\!\left(v_x - v_x^{des}\right), & \left|v_x - v_x^{des}\right| > \varepsilon, \\ 0, & \left|v_x - v_x^{des}\right| \le \varepsilon, \end{cases} \qquad (14)$$

where $\mathbf{d}$ and $\mathbf{v}^{des}$ are the residual vector and desired velocity expressed in the LCS, and $\varepsilon$ is a given tolerance. Similar relations are valid for $F_y$ and $F_z$.
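The automatic-return law can be sketched as follows; the names (d for the residual vector in the LCS, eps for the tolerance, fMax for the available thrust) are our assumptions.

```cpp
// Sketch of the automatic-return control: drive the LCS velocity towards
// v* d / l and switch thrust by the relay rule (14) per axis.
#include <cmath>

struct V3 { double x, y, z; };

V3 returnThrust(const V3& v, const V3& d, double vStar,
                double fMax, double eps)
{
    double l = std::sqrt(d.x*d.x + d.y*d.y + d.z*d.z);
    if (l < 1e-9) return {0.0, 0.0, 0.0};        // already at the target
    // Desired velocity towards the target, constant magnitude v*.
    V3 vDes = { vStar * d.x / l, vStar * d.y / l, vStar * d.z / l };
    auto relay = [&](double dv) {                // relay rule of Eq. (14)
        return std::fabs(dv) > eps ? (dv > 0.0 ? -fMax : fMax) : 0.0;
    };
    return { relay(v.x - vDes.x), relay(v.y - vDes.y), relay(v.z - vDes.z) };
}
```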
The methods and approaches proposed in this paper were implemented in our VES VirSim prototype to train cosmonaut rescue skills. The hardware platform of this system includes the following components: a high-performance computing unit based on an Intel Core i7-8700K CPU and an NVIDIA RTX 2080 graphics card, sensors, an Oculus Rift CV1 headset, Oculus Touch controllers, Manus Prime II gloves, and a display. The software complex was implemented in the object-oriented language C++ using the OpenGL 3D graphics library, the GLSL 4.3 shader language and the CUDA parallel computing architecture.
Figure 7 illustrates the structure of the functional scheme created for control of the cosmonaut model. In this scheme, the hardware connection is implemented in the form of blocks interacting with the hand and head tracking devices. At the output of the Manus Prime II block and the "Touch" output of the Oculus Control block (see section 2), the target coordinates $\mathbf{q}^*_L$ and $\mathbf{q}^*_R$ are formed for the virtual models of the left and right hands. The obtained coordinates, together with virtual sensor readings, participate in the torque computation according to Eq. (7). These torques are transmitted to the actuating devices (controlled joints). In turn, the head rotation angles at the "Rift" output of the Oculus Control block are used to compute the voltages U, which are transmitted to the virtual camera's electric motors. The computation of U is performed by means of PD controllers, taking into account the virtual camera's orientation sensor readings. When the virtual hands interact with the virtual control panel's elements, a vector s of control commands is formed, which sets the control of the jetpack in manual or automatic mode. These commands are then used to compute the thrust vector $\mathbf{F}$ and torque $\mathbf{T}$, according to which the corresponding thrusters of the space jetpack are switched on and off. In the automatic mode, the control algorithm is synthesized by formulas of the form (13) and (14) based on sensor readings.
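For the head channel, a PD controller of the kind mentioned above is enough to convert a tracked angle into a motor voltage; the sketch below is illustrative, with assumed gains and names.

```cpp
// Minimal PD controller driving one virtual camera motor from the
// head-tracking angle; kp, kd and the variable names are illustrative.
double pdVoltage(double targetAngle, double sensorAngle,
                 double angularRate, double kp, double kd)
{
    // Proportional term tracks the head angle, derivative term damps it.
    return kp * (targetAngle - sensorAngle) - kd * angularRate;
}
```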
Fig. 7. Control
scheme structure.
The proposed methods and approaches were tested on the example of an emergency simulation in which the cosmonaut has separated from the ISS surface and has non-zero velocities. Training is implemented by immersing the operator in the virtual environment (see figs. 8, 9) by means of the VR headset and gloves. Using his hands, the movements of which are copied by the virtual ones, the operator interacts with the virtual panel elements and thus controls the motion of the cosmonaut model. The testing results showed the adequacy and effectiveness of the solutions proposed in this paper for training cosmonaut skills of space jetpack control, which in the future can contribute to rescue after separation from the ISS.
Fig. 8.
Operator training with use of VR technologies.
Fig. 9.
Stereopair observed by operator.
In this paper we propose solutions for the simulation of the cosmonaut rescue process based on modern virtual reality technologies. The use of a stereo headset provides a high level of operator immersion in the virtual environment and thereby improves the quality of training of skills to control a space jetpack. The results obtained in this paper can be used for cosmonaut training, practicing actions in emergencies and various scenarios.
The publication is made within the state task of the Federal State Institution "Scientific Research Institute for System Analysis of the Russian Academy of Sciences" on "Carrying out basic scientific research (47 GP)" on topic No. FNEF-2021-0012 "Virtual environment systems: technologies, methods and algorithms of mathematical modeling and visualization. 0580-2021-0012" (Reg. No. 121031300061-2).
1. Fullerton R.K. EVA Tools and Equipment Reference Book // NASA Johnson Space Center, JSC-20466 Rev. B, Nov. 20, 1993.
2. Ryzhkov E. Hronika poleta jekipazha MKS [Chronicle of the ISS crew flight] // Russkij kosmos, No. 7, 2017, pp. 8-13 [in Russian].
3. Cygankov O.S. Pjatidesjatiletie vnekorabel'noj dejatel'nosti [Fifty years of extravehicular activity] // Kosmicheskaja tehnika i tehnologii, Vol. 1, No. 8, 2015, pp. 3-16 [in Russian].
4. Kelly J.C., Kemp K. Formal methods, specification and verification guidebook for software and computer systems, volume II: A practitioner's companion, planning and technology insertion // Technical Report NASA-GB-001-97, NASA, Washington, DC 20546, May 1997.
5. Stone R., Panfilov P., Shukshunov V. Evolution of aerospace simulation: From immersive virtual reality to serious games // In Recent Advances in Space Technologies (RAST), 2011 5th International Conference on, June 2011, pp. 655-662.
6. Cater J.P., Huffman S.D. Use of the remote access virtual environment network (RAVEN) for coordinated IVA-EVA astronaut training and evaluation // Presence: Teleoperators & Virtual Environments, Vol. 4, No. 2, 1995, pp. 103-109.
7. Liu Y. et al. VR simulation system for EVA astronaut training // Proceedings of AIAA Space 2010 Conference & Exposition, Anaheim, California, 2010.
8. Bruguera M.B., Ilk V., Ruber S., Ewald R. Use of virtual reality for astronaut training in future space missions - spacecraft piloting for the Lunar Orbital Platform - Gateway (LOP-G) // 70th International Astronautics Congress, Washington D.C., 2019.
9. Garcia A.D., Schlueter J., Paddock E. Training astronauts using hardware-in-the-loop simulations and virtual reality // AIAA SciTech Forum, Orlando, FL, 2020.
10. Wen J., Zhang J., Gao L., Li X. Modeling and simulation of Simplified Aid for EVA Rescue using virtual prototype technology // Open Automation and Control Systems Journal, No. 6, 2014, pp. 1532-1540.
11. Handley P.M., Robinson S.K., Duda K.R., Prasov Z., York S.P., West J.J. Real-time performance metrics for SAFER self-rescue // In 45th International Conference on Environmental Systems, 12-16 July 2015, Bellevue, Washington, pp. 1-14.
12. Rize J.P., Hoffman J., Carpenter M.D., Cohanim B. Real-time virtual reality environment for MAJIC attitude control system development and integration // In Proceedings of the 2014 IEEE Aerospace Conference, pp. 1-11.
13. Mihaylyuk M.V., Maltcev A.V., Timokhin P.Ju., Strashnov E.V., Krjuchkov B.I., Usov V.M. Sistema virtual'nogo okruzhenija VirSim dlja imitacionno-trenazhernyh kompleksov podgotovki kosmonavtov [The VirSim virtual environment system for the simulation complexes of cosmonaut training] // Pilotiruemye polety v kosmos, Vol. 4, No. 37, 2020, pp. 72-95 [in Russian].
14. Mikhaylyuk M.V., Torgashev M.A. Vizualnyi redaktor i modul rascheta funktsionalnykh skhem dlia imitatsiono-trenazhernykh kompleksov [The visual editor and calculation module of block diagrams for simulation and training complexes] // Programmnye produkty i sistemy, No. 4, 2014, pp. 10-15 [in Russian].
15. Shabana A.A. Computational dynamics, 3rd edition, John Wiley & Sons Ltd, 2010.
16. Featherstone R. Rigid body dynamics algorithms, New York: Springer-Verlag, 2008.
17. Mikhaylyuk M.V., Strashnov E.V., Timokhin P.Yu. Algorithms of multibody dynamics simulation using articulated-body method // Mathematica Montisnigri, No. 39, 2017, pp. 133-145.
18. Garstenauer H. A unified framework for rigid body dynamics. Master's thesis, Johannes Kepler Universität, Linz, 2006.
19. Kokkevis E. Practical physics for articulated characters // Proceedings of Game Developers Conference, 2004.
20. Stepien J. Physics-based animation of articulated rigid body systems for virtual environments. Gliwice, 2013.
21. Landau L.D., Lifshitz E.M. Mechanics, 2nd edition. Course of theoretical physics. Vol. 1, Pergamon Press, 1969.
22. Shtessel Y., Edwards C., Fridman L., Levant A. Sliding Mode Control and Observation. New York: Birkhäuser, Springer, 2014, 356 p.