C4.2 Visualisation Gateway (VG)


The C4.2 Visualisation Gateway (VG) provides a development environment for designing, rapidly prototyping, implementing and fully testing complex visualisation solutions that realise common data exploration workflows. To serve as a universal core service for multiple users, the popular IPython-based JupyterHub project has been selected; the service is built upon it and upon the framework for visual discovery developed in C4.1. The C4.2 service is envisaged to interconnect with the C3.1 AI Gateway to visualise AI-powered solutions, with C4.3 to underpin powerful VR/AR solutions, and with C4.4 to facilitate end-user data accessibility.

The service is accessed at https://vis-gateway.neanias.eu/ and is based on JupyterHub, enabling efficient execution of complex visualisation tasks whose outcomes integrate seamlessly into the overall scientific workflows.

Getting started

1) Login

To gain access to the gateway, you must first authenticate.
  • Navigate to the Visualisation Gateway
  • Click the ‘Sign In’ button
  • Click on ‘Neanias Identity Provider’
  • Authenticate using valid credentials (Microsoft or Google)

2) Spawning a notebook server

After login, you will have access to the JupyterHub server options. There are two main sections:
  • Remote storage mode
  • Image selection

Remote storage mode

Data is not generally stored in large volumes on a JupyterHub host. To use large datasets with the visualisation tools, you can mount remote storage over the WebDAV protocol. If you have credentials for such storage, select the “Other” option and enter them. Otherwise, select the “Auto-mount” option, which mounts storage containing reference datasets used for demonstration purposes.
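Once the server has started, you can sanity-check the mount from a notebook cell. A minimal sketch; the mount point path below is hypothetical (the actual path depends on the image configuration), and `list_remote_storage` is our own helper, not part of the gateway:

```python
import os

def list_remote_storage(mount_point="/home/jovyan/remote"):
    """List the entries visible under the (hypothetical) WebDAV mount point.

    Raises FileNotFoundError if the directory is missing, which usually
    means remote storage was not mounted when the server was spawned.
    """
    if not os.path.isdir(mount_point):
        raise FileNotFoundError(
            f"{mount_point} not found - was remote storage selected at spawn?")
    return sorted(os.listdir(mount_point))
```

If the auto-mounted reference datasets are present, this returns their top-level directories (e.g. the D1 dataset referenced later in this guide).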

Image selection

You may choose from the available images:
  • Splotch
  • VisIVO

After selecting an image, click “Start”. To run the demonstration, go to step 3; otherwise go to step 4.

3) Running the demonstration

Pointers
  • Both the Splotch and VisIVO images are equipped with ipynb (IPython Notebook) demonstrations
  • To run a demonstration, open the ipynb file and follow the instructions in the introduction and Python cells

4) Processing other datasets

4.1 Splotch

To run Splotch, you will need a data file and a parameter file. The parameter file must point to the data file. There are many examples of these parameter files in the reference datasets.

Paths in the parameter files must be either:
  • absolute, e.g. ‘/path/to/file’
  • relative to the directory in which Splotch6-generic is run, e.g. ‘./path/to/file’ is resolved against the current working directory
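The resolution rule can be checked before a run with a small helper. This is illustrative only (Splotch applies the same rule internally); `resolve_parameter_path` is our own name:

```python
import os

def resolve_parameter_path(path, run_dir=None):
    """Resolve a path the way Splotch will see it: absolute paths are kept
    as-is, relative paths are interpreted against the directory in which
    Splotch6-generic is launched (the current working directory by default)."""
    if os.path.isabs(path):
        return path
    run_dir = run_dir or os.getcwd()
    return os.path.normpath(os.path.join(run_dir, path))
```

For example, `resolve_parameter_path("./data/snap.bin", "/home/user")` yields `/home/user/data/snap.bin`, which is the file Splotch would try to open when launched from `/home/user`.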

You can run Splotch either from the terminal or an ipynb.

4.1.1 Terminal
  • click “New” on the server home page and select “terminal”
  • ascertain the location of the parameter file
  • run “Splotch6-generic /path/to/parameter-file”
  • examine the resulting output, which will be produced in the present working directory
4.1.2 ipynb
  • click “New” on the server home page and select “notebook”
  • ascertain the location of the parameter file
  • type “!Splotch6-generic /path/to/parameter-file” in a python cell (including the “!”)
  • run the cell
  • examine the resulting output, which will be produced in the same directory as the ipynb
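The same invocation can be wrapped in plain Python, which makes it easy to check the return code and capture the console output. A sketch, assuming Splotch6-generic is on the PATH inside the image; `run_splotch` is a hypothetical helper, not part of Splotch:

```python
import subprocess

def run_splotch(parameter_file, executable="Splotch6-generic"):
    """Run Splotch on a parameter file and return its console output.

    Equivalent to the "!Splotch6-generic /path/to/parameter-file" cell,
    but as a plain Python call so failures can be detected explicitly.
    """
    result = subprocess.run([executable, parameter_file],
                            capture_output=True, text=True)
    if result.returncode != 0:
        raise RuntimeError(f"Splotch failed:\n{result.stderr}")
    return result.stdout
```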

4.1.3 Animation

A Splotch animation can be created using a scene file (to produce multiple frames) and ImageMagick (to join the frames).
  • example scene files are shown in the reference datasets, e.g. D1/parameters/xy.scene
  • a scene file consists of:
      • a header, containing the Splotch parameters to be modified during the execution
      • a set of value lines of the same length as the header, with each line giving the parameter values for one frame
  • the scene file must be referenced in the parameter file, e.g. ‘scene_file=/path/to/scenefile.scene’
  • the file extension must be .scene

You can either create a scene file from scratch, or use the “create-an-orbit-scene-file” python notebook available in the home directory of the Splotch JupyterHub environment.

An example of a 2-frame execution in which the x and y camera parameters change between frames:

    camera_x camera_y
    4000 3000
    4500 2500

After creating the scene file, reference this in the parameter file as mentioned above, and run Splotch according to 4.1.1 or 4.1.2.
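The scene-file format described above is simple enough to generate programmatically, as an alternative to the “create-an-orbit-scene-file” notebook. A minimal sketch (`write_scene_file` is our own helper, not part of Splotch) reproducing the two-frame camera example:

```python
def write_scene_file(path, header, frames):
    """Write a Splotch scene file: a header line of parameter names,
    followed by one whitespace-separated line of values per frame."""
    with open(path, "w") as f:
        f.write(" ".join(header) + "\n")
        for values in frames:
            if len(values) != len(header):
                raise ValueError("each frame needs one value per header column")
            f.write(" ".join(str(v) for v in values) + "\n")

# Reproduce the two-frame camera example (remember the .scene extension):
write_scene_file("orbit.scene",
                 ["camera_x", "camera_y"],
                 [[4000, 3000], [4500, 2500]])
```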

Two commands must be run after Splotch to produce the animation:
  • ‘magick --appimage-extract-and-run mogrify -format png $outfile*.tga’ (convert the .tga files to .png)
  • ‘magick --appimage-extract-and-run convert -delay 10 $outfile*.png -loop 0 animation.gif’ (join the frames)
  • where ‘outfile’ is the same as the outfile in the parameter file
  • the ‘-delay’ flag sets the delay between frames, in units of 10 milliseconds
  • finally, you may want to remove the original .png files with ‘rm $outfile*.png’
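Since the two command lines only differ from run to run in the outfile prefix and the delay, they can be assembled in Python before being executed in a terminal or with “!”. A sketch; `animation_commands` is a hypothetical helper of ours:

```python
def animation_commands(outfile, delay=10):
    """Build the two ImageMagick commands run after Splotch: convert the
    rendered .tga frames to .png, then join them into an animated GIF.
    `delay` is in units of 10 ms, matching the -delay flag."""
    convert_frames = (f"magick --appimage-extract-and-run "
                      f"mogrify -format png {outfile}*.tga")
    join_frames = (f"magick --appimage-extract-and-run "
                   f"convert -delay {delay} {outfile}*.png -loop 0 animation.gif")
    return [convert_frames, join_frames]
```

In a notebook you could then run each returned command with `!{cmd}` (or via `subprocess`), keeping the outfile prefix consistent with the parameter file.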

4.2 VisIVO

To run VisIVO, you will need a data file, which can be processed via the terminal or an ipynb. For the purposes of this guide, you will use two VisIVO tools: the Importer and the Viewer. Typically, data can be processed using the commands on the second page of the VisIVO Importer user guide (https://github.com/inaf-oact-VisIVO/VisIVOServer/blob/master/VisIVOImporter2.0.pdf). More information on how to run VisIVO can be found in the other user guides (https://github.com/inaf-oact-VisIVO/VisIVOServer).

4.2.1 Terminal
  • click “New” on the server home page and select “terminal”
  • ascertain the location of the data file
  • run VisIVO as described in the user guide
  • examine the resulting output, which will be produced in the present working directory
4.2.2 ipynb
  • click “New” on the server home page and select “notebook”
  • ascertain the location of the data file
  • type the VisIVO command described in the user guide in a python cell, prefixed with “!”
  • run the cell
  • examine the resulting output, which will be produced in the same directory as the ipynb
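Both VisIVO steps can also be driven from plain Python. A minimal sketch that runs a command line copied verbatim from the user guide; the helper (`run_visivo`, our own name) assumes nothing about VisIVO’s flags, which should always be taken from the guides linked above:

```python
import shlex
import subprocess

def run_visivo(command_line):
    """Run a VisIVO command line copied from the user guide (e.g. a
    VisIVOImporter or VisIVOViewer invocation) and return its console
    output, raising if the tool reports a non-zero exit status."""
    result = subprocess.run(shlex.split(command_line),
                            capture_output=True, text=True)
    if result.returncode != 0:
        raise RuntimeError(f"VisIVO command failed:\n{result.stderr}")
    return result.stdout
```

Output files land in the notebook’s working directory, as with the terminal workflow in 4.2.1.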