v1.3 Release Notes

Highlights in 1.3:

  - RPM based installation
  - Configuration stored under /etc/ood/config instead of under app directories
  - Config changes no longer require app rebuilds
  - Xfce support for Interactive Desktops
  - Cluster config verification script
  - Apps with a period in their directory name are ignored
  - Multiple Dashboard announcements with embedded HTML
  - Better debugging of Interactive Apps by logging shell commands
  - Job Composer: optionally hide the Account field in Job Options
  - Active Jobs: display the list of nodes a job is running on

Upgrading from v1.2

As always, please update the development or test instances of OnDemand installed at your center before you modify the production instance.

The steps outlined below only need to be done once when upgrading from versions <= 1.2 to 1.3. Future updates will be handled through the RPM going forward.

  1. The RPM installation will overwrite any local configuration or modifications you made in earlier versions of OnDemand, so first make backups of OnDemand and your apps:

    sudo mv /opt/ood /opt/ood.bak
    sudo mv /var/www/ood/apps/sys /var/www/ood/apps/sys.bak
  2. All configuration now lives under /etc/ood/config. This includes the YAML configuration file used to generate the Apache configuration file, the YAML configuration file used to generate the per-user NGINX (PUN) configuration files, and any configuration/customization files used for the individual OnDemand core apps.

    So copy over your existing configuration files ahead of time for best results:

    sudo cp $HOME/ood/src/ood-portal-generator/config.yml /etc/ood/config/ood_portal.yml


    This assumes you were using the ood-portal-generator to generate the Apache configuration file under /opt/rh/httpd24/root/etc/httpd/conf.d/ood-portal.conf. If you haven’t been doing this before, now is a good time to start. Feel free to contact us if you have trouble with this.

Copy over your app configurations (if they exist).

sudo mkdir -p /etc/ood/config/apps
[[ -e "/var/www/ood/apps/sys.bak/dashboard/.env.local" ]] \
  && sudo mkdir -p /etc/ood/config/apps/dashboard \
  && sudo cp /var/www/ood/apps/sys.bak/dashboard/.env.local /etc/ood/config/apps/dashboard/env
[[ -e "/var/www/ood/apps/sys.bak/activejobs/.env.local" ]] \
  && sudo mkdir -p /etc/ood/config/apps/activejobs \
  && sudo cp /var/www/ood/apps/sys.bak/activejobs/.env.local /etc/ood/config/apps/activejobs/env
[[ -e "/var/www/ood/apps/sys.bak/myjobs/.env.local" ]] \
  && sudo mkdir -p /etc/ood/config/apps/myjobs \
  && sudo cp /var/www/ood/apps/sys.bak/myjobs/.env.local /etc/ood/config/apps/myjobs/env
[[ -e "/var/www/ood/apps/sys.bak/file-editor/.env.local" ]] \
  && sudo mkdir -p /etc/ood/config/apps/file-editor \
  && sudo cp /var/www/ood/apps/sys.bak/file-editor/.env.local /etc/ood/config/apps/file-editor/env
[[ -e "/var/www/ood/apps/sys.bak/shell/.env" ]] \
  && sudo mkdir -p /etc/ood/config/apps/shell \
  && sudo cp /var/www/ood/apps/sys.bak/shell/.env /etc/ood/config/apps/shell/env
[[ -e "/var/www/ood/apps/sys.bak/files/.env" ]] \
  && sudo mkdir -p /etc/ood/config/apps/files \
  && sudo cp /var/www/ood/apps/sys.bak/files/.env /etc/ood/config/apps/files/env
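The per-app copies above all follow the same pattern, so they can also be expressed as a loop. The sketch below is illustrative, not a replacement for the documented commands: it uses temporary directories and a made-up file so it runs anywhere, whereas on a real system the roots are /var/www/ood/apps/sys.bak and /etc/ood/config/apps and each command needs sudo.

```shell
# Sketch of the copy pattern above using scratch directories (hypothetical
# stand-ins for /var/www/ood/apps/sys.bak and /etc/ood/config/apps).
SRC=$(mktemp -d)
DEST=$(mktemp -d)
mkdir -p "$SRC/dashboard"
echo "# example dashboard settings" > "$SRC/dashboard/.env.local"

for app in dashboard activejobs myjobs file-editor; do
  if [[ -e "$SRC/$app/.env.local" ]]; then   # Rails apps use .env.local
    mkdir -p "$DEST/$app"
    cp "$SRC/$app/.env.local" "$DEST/$app/env"
  fi
done
for app in shell files; do
  if [[ -e "$SRC/$app/.env" ]]; then         # Node apps use .env
    mkdir -p "$DEST/$app"
    cp "$SRC/$app/.env" "$DEST/$app/env"
  fi
done

ls "$DEST/dashboard"   # → env
```

Only apps that actually have an env file get a directory under the destination, matching the `[[ -e ... ]]` guards in the commands above.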

Copy over any custom initializers you may have created (if they exist)

[[ -e "/var/www/ood/apps/sys.bak/dashboard/config/initializers/ood.rb" ]] \
  && sudo mkdir -p /etc/ood/config/apps/dashboard/initializers \
  && sudo cp /var/www/ood/apps/sys.bak/dashboard/config/initializers/ood.rb /etc/ood/config/apps/dashboard/initializers/ood.rb
[[ -e "/var/www/ood/apps/sys.bak/activejobs/config/initializers/filter.rb" ]] \
  && sudo mkdir -p /etc/ood/config/apps/activejobs/initializers \
  && sudo cp /var/www/ood/apps/sys.bak/activejobs/config/initializers/filter.rb /etc/ood/config/apps/activejobs/initializers/filter.rb

Copy over your Job Composer templates (if they exist)

[[ -e "/var/www/ood/apps/sys.bak/myjobs/templates" ]] \
  && sudo mkdir -p /etc/ood/config/apps/myjobs \
  && sudo cp -r /var/www/ood/apps/sys.bak/myjobs/templates /etc/ood/config/apps/myjobs/.

Copy over your local Interactive Desktop apps (if they exist)

[[ -e "/var/www/ood/apps/sys.bak/bc_desktop/local" ]] \
  && sudo cp -r /var/www/ood/apps/sys.bak/bc_desktop/local /etc/ood/config/apps/bc_desktop

If all went well, you should have a directory structure that looks similar to:

tree /etc/ood/config
# /etc/ood/config
# ├── apps
# │   ├── activejobs
# │   │   └── ...
# │   ├── bc_desktop
# │   │   └── ...
# │   ├── dashboard
# │   │   └── ...
# │   ├── files
# │   │   └── ...
# │   ├── myjobs
# │   │   └── ...
# │   └── shell
# │       └── ...
# ├── clusters.d
# │   ├── my_cluster.yml
# │   └── ...
# ├── nginx_stage.yml
# └── ood_portal.yml
  3. Add Open OnDemand’s repository hosted by the Ohio Supercomputer Center:

    sudo yum install https://yum.osc.edu/ondemand/1.3/ondemand-release-web-1.3-1.el7.noarch.rpm
  4. Install OnDemand and all of its dependencies:

    sudo yum install ondemand
  5. Copy back any custom apps (e.g., Jupyter, RStudio, …) you installed previously from the backup directory:

    sudo cp -r /var/www/ood/apps/sys.bak/CUSTOM_APP /var/www/ood/apps/sys/.
  6. The installation installs all software and web apps, generates a new Apache configuration file, and restarts Apache. If all went well, you should now be able to access the OnDemand portal in your browser.

Infrastructure Version Changes

Table 16 Infrastructure Component Versions

0.4.0 → 0.7.1 (diff)
0.3.1 → 0.5.0 (diff)
0.3.0 → 0.5.0 (diff)

Table 16 lists the version of each infrastructure component in this release, together with the previous version it was updated from.

Application Version Changes

Table 17 Application Versions

Dashboard App       1.18.0 → 1.26.2 (diff)
Shell App           1.2.4 → 1.3.1 (diff)
Files App           1.3.6 → 1.4.1 (diff)
File Editor App     1.3.1 → 1.3.3 (diff)
Active Jobs App     1.5.2 → 1.6.2 (diff)
Job Composer App    2.6.1 → 2.8.3 (diff)
Desktops App        0.1.2 → 0.2.0 (diff)

Table 17 lists the version of each system web application in this release, together with the previous version it was updated from.


RPM based installation

This is the biggest change and constitutes the bulk of this release’s work. RPM based installation is now the default and documented way to install and update OnDemand. The latest RPMs in use at OSC can be accessed from https://yum.osc.edu/ondemand/latest/ and stable releases are available in release-specific version directories, such as https://yum.osc.edu/ondemand/1.3/.

Store configuration under /etc instead of under app directories

To enable RPM based installation, two modifications have been made to all configuration for OnDemand:

  1. All configuration can be moved to files under /etc/ood

  2. Modifications to configuration only require a Passenger app, per-user NGINX (PUN), or Apache server restart (with the exception of the ood-portal-generator config)

Since all configuration is stored under /etc/ood, this directory can easily be managed by Puppet or versioned in a git repository.
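For example, one way to put the consolidated configuration under git version control. This is a sketch in a scratch directory with placeholder contents; on a real system you would run git in /etc/ood itself, with sudo.

```shell
# Demo: version an OnDemand-style config tree with git.
# CFG stands in for /etc/ood; the file contents are placeholders.
CFG=$(mktemp -d)
mkdir -p "$CFG/apps/dashboard" "$CFG/clusters.d"
echo "# cluster definition" > "$CFG/clusters.d/my_cluster.yml"
echo "# dashboard settings" > "$CFG/apps/dashboard/env"

git -C "$CFG" init -q
git -C "$CFG" add -A
git -C "$CFG" -c user.name=ood -c user.email=ood@example.com \
  commit -qm "initial OnDemand config"

git -C "$CFG" rev-list --count HEAD   # → 1
```

From there, each configuration change can be committed, giving you a history to roll back to if a change misbehaves.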

Config changes no longer require app rebuilds

It is now much faster to make and test configuration changes. For most configuration changes, you can make the change and then select “Restart Web Server” from the “Help” or “Develop” dropdown to see the change.

Xfce support for Interactive Desktops

We now have documentation for enabling Xfce 4+ as the desktop environment for OnDemand Interactive Desktops. Xfce is the desktop environment we now use internally at OSC. See Modify Form Attributes for documentation on how to use Xfce in OnDemand.
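As a sketch of what that looks like, a bc_desktop form file selecting Xfce might resemble the following. The file name, title, and cluster name here are hypothetical; see the Modify Form Attributes documentation for the authoritative format.

```yaml
# /etc/ood/config/apps/bc_desktop/my_cluster.yml  (hypothetical file)
title: "My Cluster Desktop"
cluster: "my_cluster"
attributes:
  desktop: "xfce"   # use Xfce as the desktop environment
```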

Cluster config verification script

A Rake task has been added to the Dashboard app that will submit and check the status of jobs for each cluster specified in the cluster config. This provides a quick way to verify that OnDemand has been properly configured for a new cluster and should speed up installation. See Test Configuration in the cluster configuration documentation for more details.

Ignore apps if they have a period in directory name

You can effectively hide apps from being displayed in the Dashboard by adding a period in the app’s directory name. This is useful if you want to make a backup of an app, e.g., ../myapp.bak/. Or just want to include a hidden directory in the app deployment directory, e.g., ../.hidden-app/.
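The effect can be sketched with plain shell. This is a toy demo in a scratch directory, not the Dashboard's actual scanning code: it lists app directories the way the Dashboard would, skipping any whose name contains a period.

```shell
# Toy demo of the hiding rule: directories with a "." in their name are
# skipped. Uses a scratch directory, not /var/www/ood/apps/sys.
APPS=$(mktemp -d)
mkdir -p "$APPS/jupyter" "$APPS/myapp.bak" "$APPS/.hidden-app"

for d in "$APPS"/*/; do          # dot-directories never match this glob
  name=$(basename "$d")
  [[ "$name" == *.* ]] && continue   # hidden from the Dashboard
  echo "$name"
done   # → jupyter
```

Only jupyter is listed: myapp.bak is skipped by the period rule, and .hidden-app is never matched by the glob at all.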

Enable multiple Dashboard announcements with embedded HTML

Site-wide dashboard announcement support in OnDemand has been expanded. Originally we supported a single file /etc/ood/config/announcement.md, but now a YAML file /etc/ood/config/announcement.yml can be used, and a collection of Markdown and YAML announcements can be added under /etc/ood/config/announcements.d/. The YAML format provides extra benefits:

  1. Pre-process the file using ERB, so that ERB tags can provide per-request dynamic modification of the announcement

  2. Control the color of the announcement with :type, which is the Bootstrap alert name (warning, info, success, or danger)

  3. Control whether the announcement appears by setting :msg to a string or nil
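For example, the simplest form is a static announcement (the message text here is made up):

```yaml
# /etc/ood/config/announcement.yml
type: info
msg: |
  **Notice:** this is an example announcement rendered as a Bootstrap "info" alert.
```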

The ability to use ERB means we can set the msg to nil after a certain time period. For example:

type: warning
msg: |
  <%- if Time.now < Time.new(2018, 1, 23, 15, 0, 0) -%>
  **NOTICE:** The Ruby nodes on the Quick cluster will go down on Tuesday
  January 23, 2018 from 1 - 3 pm for scheduled maintenance. This will affect
  only **Ruby VDI** sessions scheduled to run during this time period. These
  sessions will be put on hold until after the maintenance period is complete.
  <%- end -%>

In this example, the announcement appears on the dashboard until January 23, 2018 at 3:00 pm. We’ll add more documentation for this soon; if you want to take advantage of it now, just ask a question on the ood-users mailing list.

Better debugging of Interactive Apps by logging shell commands

Whenever an Interactive Session is started from the Dashboard, the shell command used to submit the job is logged to the user’s NGINX log to help with debugging Interactive Apps.

Job Composer: Optionally hide Account field in Job Options

The Job Composer provides a field in the Job Options form to set the Account. When the job is submitted, the appropriate account flag for the resource manager is used (whether that is -A, -P, --account, etc.). However, some sites do not use this field, and others use different mechanisms for accounting. Long term we want to support flexible configuration of this web form, but for now we have added the ability to hide the Account field. Hide it by adding the following line to the Job Composer’s env file: OOD_SHOW_JOB_OPTIONS_ACCOUNT_FIELD=0
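Following the configuration layout described earlier in these notes, that setting goes in the Job Composer's env file:

```shell
# /etc/ood/config/apps/myjobs/env
OOD_SHOW_JOB_OPTIONS_ACCOUNT_FIELD=0
```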

Active Jobs: display list of nodes that a job is running on

In Active Jobs, if the resource manager provides it, the list of nodes a job is running on is displayed in the job’s details section.