SciNet News November 2013

November 15, 2013 in for_researchers, for_users, newsletter

EVENTS COMING UP

Unless stated otherwise, all events take place at the SciNet Headquarters, Rm 235 of 256 McCaul Street, Toronto. All events below are free for users but we ask that you sign up (“enroll”) on the education website.

  • Tuesdays and Thursdays, 11:00 am – 12:00 noon

    November 5 – 28

    INTRODUCTION TO RESEARCH COMPUTING WITH PYTHON

    Learn about research computing even with little programming experience. The basics of programming in Python, best practices, and visualization will be covered in eight lectures.

    This course can be taken as a “mini-course” by astrophysics graduate students and as a “modular course” by physics graduate students.

    Participation also counts towards the SciNet Scientific Computing Certificate.

    For more information go to the course website.

    Note: this course has already started.

  • Tuesday November 26, 2:30 pm – 3:30 pm

    COAST-TO-COAST SEMINAR

    Collaboration across disciplines in the design of effective technologies to support older adults

    Dr. Alex Mihailidis (University of Toronto)

    Abstract:

    As the complexity of the needs of older adults continues to increase, so do the demands on the technologies that we design. We can no longer take the uni-dimensional design approach that has often been used in the past; research and development in this field requires input from a multitude of stakeholders, who must all play a greater role in our traditional design methodologies. The talk will discuss how collaboration across different technical and clinical disciplines is needed to design technologies that can effectively support and help older adults. It will discuss different approaches that are currently being used to include end users in the design process, and will present examples of technologies that have been developed.

  • Wednesday December 11, 10:30 am – 11:30 am

    INTRO TO SCINET

    A class of approximately 90 minutes in which you will learn how to use the systems. Experienced users may still pick up some valuable pointers during these sessions.

    Participation counts towards the SciNet HPC Certificate.

    For more information and enrollment, go to the course website.

  • Wednesday December 11, 12:00 noon – 1:00 pm

    SCINET USER GROUP (SNUG) MEETING

    TechTalk: TBA

    For more information and enrollment, go to the course website.

  • Winter 2014:

    SCIENTIFIC COMPUTING COURSE

    Many computational projects start off with knowledge of the science you want to do and a bit of programming experience. It can be an arduous journey from there to a (maintainable) piece of code that you trust to compute the right thing. This course aims to reduce that struggle and to make you a more efficient computational scientist. Topics include well-established best practices for developing software as they apply to scientific computation, common numerical techniques and packages (so you don’t reinvent the wheel), and aspects of high-performance computing.

    The course consists of three parts:

    1. Scientific Software Development & Design
    2. Numerical Tools for Physical Scientists
    3. High Performance Scientific Computing

    Each part consists of eight one-hour lectures. Students with limited programming experience are encouraged to take “Introduction to Research Computing” first (see above).

    Note that these parts can be taken as “mini-courses” by astrophysics graduate students and as “modular courses” by physics graduate students.

    Participation in parts 1 and 2 counts towards the SciNet Scientific Computing Certificate.

    Participation in part 3 counts towards the SciNet HPC Certificate.

    For more information (soon) and enrollment, go to
    the course website for the first part,
    the course website for the second part, or
    the course website for the third part.

SYSTEM NEWS

  • The user-contributed x86 system “Sandy” is now open for job submission by other SciNet users as well. Jobs are scheduled with default priority, depending on node availability. This Sandy Bridge x86 cluster has 76 nodes, each with 16 cores and 64 GB of RAM. For more information on the system and how to use it, see the wiki.

  • Access to the user-contributed GPU system “Gravity” can now be requested by users (similar to the ARC). Gravity is a cluster of 49 nodes, each with 12 cores, 32 GB of RAM, and two NVIDIA Tesla M2090 GPUs. Each GPU has CUDA compute capability 2.0, 512 CUDA cores, and 6 GB of RAM. For more information on the system and how to use it, see the wiki.

  • BGQ: PAPI 5.2.0, a library to access performance counters, installed.

  • BGQ: MemP 1.0.3, a memory profiling tool, installed.

  • GPC: Octopus 4.1.1, an ab-initio package, installed.

  • GPC: SamTools, a sequence alignment/map format library, installed.

  • GPC: PFFT 1.0.7-alpha, a parallel portable library for computing fast Fourier transforms, installed.

  • GPC: Trilinos 11.4.2 module installed.

  • GPC: Open Babel 2.3.2 module installed.

  • GPC: Ncview 2.1.2 module installed.

  • GPC: NetCDF 4.2.1.1 modules installed.

  • GPC: HDF5 1.8.11 modules installed.

  • GPC: Parallel netCDF 1.3.1 modules for IntelMPI and OpenMPI installed.

  • GPC: GDAL 1.9.2 installed as a module.

  • P7: GCC 4.8.1 installed.

  • GPC: Rsync 3.1.0 installed as a module.

  • GPC: User-space MySQL module installed.

ADDED TO THE WIKI

All new wiki content below is listed and linked on the main page.

  • Instructions on using “Sandy” and “Gravity” (links are in the sidebar)

  • Slides of the MPI 3.0 Developer Seminar

  • Slides of the SNUG TechTalk on MySQL on GPC

  • Slides and Recordings of the Intro to Relational Databases

  • Slides and Recordings of the first four lectures of Research
    Computing with Python

WHAT HAPPENED AT SCINET IN THE LAST MONTH?

  • October 16: SNUG Meeting with TechTalk on MySQL on GPC

  • October 23: SciNet developer seminar on MPI 3

  • October 30: Relational database basics

  • November 13: SNUG Meeting with TechTalk on Molecular Motors by Peter Colberg

  • November 5-14: First four lectures of Research Computing with Python

SciNet News October 2013

October 10, 2013 in for_researchers, for_users, newsletter

EVENTS COMING UP

Unless stated otherwise, all events take place at the SciNet Headquarters, Rm 235 of 256 McCaul Street, Toronto. All events below are free for users but we ask that you sign up (“enroll”) on the education website: https://support.scinet.utoronto.ca/education.

  • Compute Canada Research-Needs Survey

    Compute Canada (CC) is seeking input from the research community to help shape how research computing is provided to Canadian researchers over the next five years. A confidential 15-minute survey can be filled out here.

    Compute Canada increasingly provides Canadian researchers with their computing services, so it is critical for us to hear about your needs.

  • Wednesday October 16, 2013, 3:00 pm (Eastern)

    DEADLINE FOR 2014 COMPUTE CANADA RESOURCE ALLOCATION PROPOSALS

    For more information, see the Compute Canada website.

  • Wednesday October 16, 12:00 noon – 1:00 pm

    SCINET USER GROUP (SNUG) MEETING

    This time, we will have

    • TechTalk: MySQL on GPC, by Ramses van Zon (SciNet)
    • User discussion
    • Pizza!

    For more information and enrollment, go to the course website.

  • Wednesday October 23, 3:00 pm – 4:00 pm

    SCINET DEVELOPER SEMINAR

    MPI 3: What is new? (Scott Northrup, SciNet)

    Participation counts towards the SciNet Scientific Computing or High Performance Computing Certificate.

    For more information and enrollment, go to the course website.

  • Wednesday October 30, 2:00 pm – 5:00 pm

    RELATIONAL DATABASE BASICS

    Participation counts towards the SciNet Scientific Computing Certificate.

    For more information and enrollment, go to the course website.

  • Tuesdays and Thursdays, 11:00 am – 12:00 noon
    November 5 – 28

    INTRODUCTION TO RESEARCH COMPUTING

    Learn about research computing even with little programming experience. The basics of programming in Python, best practices, and visualization will be covered in eight lectures.

    This course can be taken as a “mini-course” by astrophysics graduate students and as a “modular course” by physics graduate students.

    Participation also counts towards the SciNet Scientific Computing Certificate.

    For more information and enrollment, go to the course website.

  • Wednesday November 13, 12:00 noon – 1:00 pm

    SCINET USER GROUP (SNUG) MEETING

    TechTalk: Molecular and mesoscale simulations using OpenCL and LuaJIT (Peter Colberg)

    For more information and enrollment, go to the course website.

  • Wednesday December 11, 10:30 am – 11:30 am

    INTRO TO SCINET

    A class of approximately 90 minutes in which you will learn how to use the systems. Experienced users may still pick up some valuable pointers during these sessions.

    Participation counts towards the SciNet HPC Certificate.

    For more information and enrollment, go to the course website.

  • Wednesday December 11, 12:00 noon – 1:00 pm

    SCINET USER GROUP (SNUG) MEETING

    TechTalk: TBA

    For more information and enrollment, go to the course website.

  • Winter 2014:

    SCIENTIFIC COMPUTING COURSE

    Many computational projects start off with knowledge of the science you want to do and a bit of programming experience. It can be an arduous journey from there to a (maintainable) piece of code that you trust to compute the right thing. This course aims to reduce that struggle and to make you a more efficient computational scientist. Topics include well-established best practices for developing software as they apply to scientific computation, common numerical techniques and packages (so you don’t reinvent the wheel), and aspects of high-performance computing.

    The course consists of three parts:

    1. Scientific Software Development & Design
    2. Numerical Tools for Physical Scientists
    3. High Performance Scientific Computing

    Each part consists of eight one-hour lectures. Students with limited programming experience are encouraged to take “Introduction to Research Computing” first (see above).

    Note that these parts can be taken as “mini-courses” by astrophysics graduate students and as “modular courses” by physics graduate students.

    Participation in parts 1 and 2 counts towards the SciNet Scientific Computing Certificate.

    Participation in part 3 counts towards the SciNet HPC Certificate.

    For more information (soon) and enrollment, go to
    the course website for the first part,
    the course website for the second part, or
    the course website for the third part.

SYSTEM NEWS

  • GPC: git-annex is available as a module.
  • GPC: Armadillo 3.910.0 template library added as a module.
  • GPC: Version 3.14.1 of the ParaView server installed.
  • GPC: Python 2.7.5 installed as a module.
  • GPC: Allinea MAP 4.1 profiler available on the GPC as part of the
    ddt/4.1 module.
  • GPC: Intel compiler 14.0.0 available as a module.
  • TCS: Compilers have been updated (patched).
  • TCS: There’s now a cmake module.
  • TCS: New versions of HDF5 available as modules 1811-v18-poe-xlc and 1811-v18-serial-xlc.
  • TCS: New version of parallel-netcdf (1.3.1).
  • P7: Python 2.7.5 available as a module.
  • BGQ: OpenFOAM module installed.
  • BGQ: Reminder: there is an HPSS/BGQ bridge, so BGQ users can now directly offload their data to their HPSS space via the GPC archive queue. See the wiki for details.

ADDED TO THE WIKI

All new wiki content below is listed and linked on the main page.

  • Slides of the SNUG TechTalk on Git-annex
  • Slides and recording of the SciNet Developer Seminar on “OpenMP 4”
  • Instructions on using OpenFOAM on the BGQ

WHAT HAPPENED AT SCINET IN THE LAST MONTH?

  • September 10: Intro to the Linux shell
  • September 11: Intro to SciNet
  • September 11: SNUG meeting with TechTalk on git-annex
  • September 25: SciNet Developer Seminar on “OpenMP 4: What is new?”
  • October 9: Intro to SciNet

SciNet News November 2012

November 5, 2012 in for_researchers, for_users, newsletter

EVENTS COMING UP

  • Wednesday November 7, 10:00 am – 11:30 am

    INTRO TO SCINET

    Learn what SciNet resources are available, how to compile your code and how to use the batch system, in approximately 90 minutes. Intended for new users, but experienced users may still pick up some valuable pointers.

    Sign up at https://support.scinet.utoronto.ca/courses/?q=node/80

  • Friday November 9, 12:30 pm – 2:00 pm

    SCINET DEVELOPER SEMINAR

    By Fernando Perez, the inventor of the IPython Notebook.

    Because of the lunch-hour scheduling, we will also be providing pizza!

    For pizza planning purposes, please sign up at https://support.scinet.utoronto.ca/courses/?q=node/84

  • Tuesday November 13, 2:30 pm – 3:30 pm

    COAST-TO-COAST SEMINAR SERIES

    “Oil & Fish Tails: Cuts to Canada’s environment and the changing face of Metro Vancouver’s oil and gas industry”

    Mr. Fin Donnelly and Mr. Kennedy Stewart (Members of Parliament)

    More info at http://c2c.irmacs.sfu.ca

  • Wednesday November 14, 12:00 noon – 1:00 pm

    SNUG MEETING W/TECHTALK

    GNU PARALLEL FOR LARGE BATCHES OF SMALL JOBS

    There will also be pizza and discussion.

    Sign up at https://support.scinet.utoronto.ca/courses/?q=node/71

  • Tuesday November 27, 2:30 pm – 3:30 pm

    COAST-TO-COAST SEMINAR SERIES: Dr. Rob Macdonald (Fisheries and Oceans Canada)

    More info at http://c2c.irmacs.sfu.ca

  • Wednesday November 28, 10:00 am – 5:00 pm

    PARALLEL DEBUGGING WITH DDT

    For more information and sign up, see https://support.scinet.utoronto.ca/courses/?q=node/82

  • Wednesday December 12, 12:00 noon – 1:00 pm

    SNUG MEETING

    TechTalk: TBA

    There will also be pizza and discussion.

    Sign up at https://support.scinet.utoronto.ca/courses/?q=node/72

  • January 2013: GRADUATE COURSE IN SCIENTIFIC COMPUTING.

    Details to follow.

SYSTEM CHANGES

ADDED TO THE WIKI

All new wiki content below is listed and linked on the main page: http://wiki.scinethpc.ca/wiki/index.php/SciNet_User_Support_Library#What.27s_New_On_The_Wiki

  • BGQ system at SciNet: Slides on the Oct 10 SNUG TechTalk by Scott Northrup.
  • For our friendly users of the BGQ, there is now a continually updated wiki page to serve as a BGQ Quickstart.
  • OpenACC: Slides from the Sept 19 SciNet Developer Seminar by Mark Ebersole from NVIDIA.
  • Performance Tuning with the IBM XL Compilers: Slides from the Sept 18 SciNet Developer Seminar by Kit Barton from IBM.
  • Perl page updated on how to setup local modules with cpan.

WHAT ELSE HAPPENED AT SCINET IN OCTOBER?

  • October 1: Maintenance downtime.
  • October 10: Intro to the Linux shell
  • October 10: SNUG meeting w/Techtalk about the new BGQ system.
  • October 18, 19 and 22: Power-related downtime
  • October 23: SciNet Developer Seminar on the Julia language by Michael Nolta (CITA, Toronto).

SciNet News September 2012

September 5, 2012 in for_researchers, for_users, newsletter

EVENTS COMING UP

  • September 5-6, 10:00 pm – 2:00 am: NO CONNECTION TO SCINET

    For router maintenance, users will not be able to log into SciNet during this window. Running jobs will not be affected.

  • Wednesday September 12, 10:00 am – 11:30 am

    INTRO TO SCINET

    Learn what SciNet resources are available, how to compile your code and how to use the batch system, in approximately 90 minutes. Highly recommended for new users, but experienced users may still pick up some valuable pointers.

    Sign up at https://support.scinet.utoronto.ca/courses/?q=node/74

    You are encouraged to stick around for the immediately following SNUG meeting (please sign up separately).

  • Wednesday September 12, 12:00 noon – 1:00 pm

    SNUG MEETING

    The SciNet Users Group (SNUG) meetings are held every month on the second Wednesday, and involve pizza, user discussion, feedback, and one or two short talks on topics or technologies of interest to the SciNet community. These ‘TechTalks’ are intended to be given by users (as well as occasionally by SciNet analysts), and provide an opportunity to share your SciNet experiences, tips and tricks.

    The subject of this month’s TechTalk is:

    SCIENCE IS DATA

    or

    Why Optimizing Your Workflow and Data Management on SciNet
    Enables More and Better Science, With a Bird's-Eye View of How
    To Achieve This Based on Several Successful Use-Cases, and
    Including The Possible Use of HPSS for Big Data.

    Presented by several of SciNet’s analysts.

    Sign up at https://support.scinet.utoronto.ca/courses/?q=node/69

  • Monday September 17, 2:00 pm – 4:00 pm

    SCINET DEVELOPER SEMINAR

    IBM XL COMPILERS AND OPTIMIZATION

    By Kit Barton (IBM Toronto Compiler Team)

    Sign up at https://support.scinet.utoronto.ca/courses/?q=node/75

  • Tuesday September 25, 9:00 am – 5:00 pm

    WORKFLOW OPTIMIZATION FOR LARGE SCALE BIOINFORMATICS

    For more information and sign up, see https://support.scinet.utoronto.ca/courses/?q=node/76

  • Wednesday October 10, 10:00 am – 12:00 noon

    INTRO TO THE LINUX SHELL

    Extremely useful for new users of SciNet who are not yet familiar with the Linux shell or other Unix command-line environments.

    For more information (soon) and sign up, see https://support.scinet.utoronto.ca/courses/?q=node/77

  • Wednesday October 10, 12:00 noon – 1:00 pm

    SNUG MEETING W/TECHTALK:

    OVERVIEW OF THE NEW BLUE GENE/Q SYSTEM AT SCINET

    Sign up at https://support.scinet.utoronto.ca/courses/?q=node/70

  • Tuesday October 23, 2:00 pm – 4:00 pm

    SCIDEV SEMINAR

    THE JULIA LANGUAGE

    By Michael Nolta (CITA, Toronto)

    From julialang.org:

    "Julia is a high-level, high-performance dynamic programming
    language for technical computing, with syntax that is familiar
    to users of other technical computing environments."

    Sign up at https://support.scinet.utoronto.ca/courses/?q=node/78

  • Tuesday October 30, 9:00 am – 5:00 pm

    HYBRID OPENMP/MPI PROGRAMMING

    For more information and sign up, see https://support.scinet.utoronto.ca/courses/?q=node/79

  • Wednesday November 7, 10:00 am – 11:30 am

    INTRO TO SCINET

    Learn what SciNet resources are available, how to compile your code and how to use the batch system, in approximately 90 minutes. Intended for new users, but experienced users may still pick up some valuable pointers.

    Sign up at https://support.scinet.utoronto.ca/courses/?q=node/80

  • Wednesday November 7, 12:00 noon – 1:00 pm

    SNUG MEETING

    TechTalk: TBA (Want to present? Email support@scinet.utoronto.ca)

    Sign up at https://support.scinet.utoronto.ca/courses/?q=node/71

  • Tuesday November 20, 2:00 pm – 4:00 pm

    SCIDEV SEMINAR

    OPENACC: PROGRAMMING GPUS USING COMPILER DIRECTIVES

    For more information and sign up, see https://support.scinet.utoronto.ca/courses/?q=node/81

  • Tuesday November 27, 9:00 am – 5:00 pm

    PARALLEL DEBUGGING WITH DDT

    For more information and sign up, see https://support.scinet.utoronto.ca/courses/?q=node/82

  • Wednesday December 12, 12:00 noon – 1:00 pm

    SNUG MEETING

    TechTalk: TBA (Want to present? Email support@scinet.utoronto.ca)

    Sign up at https://support.scinet.utoronto.ca/courses/?q=node/72

SYSTEM CHANGES

  • Scratch purging: Despite the three-month limit, scratch is starting to get full. Users who have material on scratch that they can delete or move elsewhere are strongly encouraged to do so. Treating scratch as permanent storage by touching your files so they will not get deleted is not acceptable usage. Remember that scratch is not backed up, so there is no protection against, for example, accidental deletion or overwriting of files.
  • GPC: The GPC queues now have a minimum walltime of 15 minutes. For very short test runs, please use the debug queue, which does not have this restriction.
  • GPC: A new version of GNU parallel is available in the module gnu-parallel/20120622. The older version from 2010 is still the default, because a few flags have changed.
  • TCS: New module gmake/3.82.
  • TCS and P7: New versions of the C, C++, and Fortran compilers are available in the modules vacpp/12.1 and xlf/14.1, respectively. Versions vacpp/11.1 and xlf/13.1 are still the default.
  • ARC: The scheduler on the ARC is now integrated with the GPC.
  • ARC: Portland Group compilers (version 12.6) are now available as module pgi/12.6. Having access to these compilers is one of the perks of being an “NVIDIA CUDA Research Center”. These PGI compilers support GPGPU programming through CUDA Fortran and OpenACC, in addition, of course, to the traditional support for C, C++, Fortran, OpenMP, and MPI. This version supports CUDA 4.2.

    For more info, see the wiki page for the GPU nodes of the ARC.

  • BGQ: The Blue Gene/Q is being delivered. More details later…
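
    Two of the GPC items above (the 15-minute minimum walltime and the new GNU parallel module) combine naturally: many short tasks can be packed into a single sufficiently long job. The following is only a sketch, assuming Torque/Moab-style directives; the task script and input file names are placeholders, so check the wiki for the exact conventions on your allocation.

    ```shell
    #!/bin/bash
    # Hypothetical GPC submission script (sketch only).
    #PBS -l nodes=1:ppn=8,walltime=1:00:00   # regular queues now require >= 15 minutes
    #PBS -N packed-short-tasks

    cd $PBS_O_WORKDIR

    # Load the newer GNU parallel module mentioned above.
    module load gnu-parallel/20120622

    # Run one short task per input file, 8 at a time, so the job as a
    # whole comfortably exceeds the minimum walltime. "process_case"
    # and the input file names are placeholders.
    parallel -j 8 './process_case {}' ::: input_*.dat
    ```

    For very short test runs, the debug queue (which does not have the minimum-walltime restriction) remains the right choice.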

ADDED TO THE WIKI

All new wiki content below is listed and linked on the main page: http://wiki.scinethpc.ca/wiki/index.php/SciNet_User_Support_Library#What.27s_New_On_The_Wiki

  • The wiki front page now has a separate ‘System Update’ box. Whenever you are wondering if we have changed something on the system, check here first.
  • Updated FAQ entry to account for minimum 15 minute walltime.
  • Info on submitting jobs on GPU Cluster, now integrated into GPC scheduler.
  • How to use the PGI compilers supporting OpenACC and CUDA Fortran on the GPU cluster.

WHAT ELSE HAPPENED AT SCINET IN JULY/AUGUST?

  • July 9: Intro to SciNet
  • July 8, 15, 26: Power glitches took down the data centre. Systems were brought back up on the same day.

SciNet News June 2012

June 5, 2012 in for_researchers, for_users, newsletter

EVENTS COMING UP

  • Thu Jun 7: SCHEDULED SHUTDOWN OF ALL SCINET SYSTEMS

    There will be a full SciNet shutdown on June 7th from 6 AM to at least 10 PM.

    This is the final scheduled shutdown in preparation for the installation of the IBM Blue Gene/Q system. A new machine room has been built (walls, raised floor, cooling unit, electrical and water connections), but downtime is required to connect 800 kW of power from our electrical room to the new room.

    All systems will go down at 6 AM on Thu 7 Jun; all login sessions and jobs will be killed at that time.

    At the earliest, the systems will be available again around 10 PM on the evening of Thu 7 Jun. Check the SciNet wiki (wiki.scinethpc.ca) for updates on Thursday.

  • Wed Jun 13, 12:00 noon – 1:00 pm: JUNE SCINET USER GROUP (SNUG) MEETING

    The SciNet Users Group (SNUG) meetings are every month on the second Wednesday, and involve pizza, user discussion, feedback, and one or two short talks on topics or technologies of interest to the SciNet community.

    • TechTalk by Ramses van Zon on “Remote development on SciNet systems”
    • User discussion
    • Pizza!

    Sign up at https://support.scinet.utoronto.ca/courses/?q=node/66

  • Jun 25-28: Ontario Summerschool on High Performance Computing / Central

    The Ontario Summerschool on High Performance Computing provides attendees with opportunities to learn and share knowledge and experience in high-performance and technical computing. This year, the summer school will have more than one installment: the first, in London, runs Jun 4-7; the second, hosted by SciNet in Toronto, runs Jun 25-28.

    The format will be a four day workshop with mixed lectures and hands-on sessions on a number of selected subjects, including shared memory programming, distributed memory programming and general purpose graphics processing unit programming. The graphics cards and distributed memory sessions will be given in parallel.

    This event will be held at the University of Toronto (St. George campus), but not at the SciNet Headquarters.

    For more information on this event, see the temporary description at

    https://support.scinet.utoronto.ca/courses/?q=node/67

    Next week the official registration site should be accessible at

    http://www.sharcnet.ca/event/ss2012-central

  • Mon July 9, 12:00 noon – 1:30 pm: INTRO TO SCINET

    Learn what SciNet resources are available, how to compile your code and how to use the batch system, in approximately 90 minutes.

    Intended for new users, but experienced users may still pick up some valuable pointers.

    Sign up at https://support.scinet.utoronto.ca/courses/?q=node/68

SYSTEM CHANGES:

  • GPC: A newer git version, 1.7.10, is now available as a module (the default is still 1.7.1).
  • GPC: Silo installed as a module.
  • GPC: gcc 4.7.0 available as a module (version 4.6.1 is still the default).
  • HPSS: Jobs will now run automatically.
  • ARC: CUDA 4.1 and 4.2 are available as modules (note: 4.2 is not supported by the DDT debugger); cuda/4.1 will be the default as of Jun 8.
  • P7: NCL available as a module.
  • P7: SCons available as a module.

ADDED TO THE WIKI IN MAY

All new wiki content below is listed and linked on the main page: http://wiki.scinethpc.ca/wiki/index.php/SciNet_User_Support_Library#What.27s_New_On_The_Wiki

  • Intro to SciNet slides
  • ADIOS TechTalk, slides and source code
  • Updates to the HPSS page

WHAT ELSE HAPPENED AT SCINET IN MAY?

  • May 8-9: Scheduled shutdown of all SciNet systems for maintenance and system testing.
  • May 9: Intro to SciNet session
  • May 9: May SNUG meeting with TechTalk by Jonathan Dursi on “Parallel I/O doesn’t have to be so hard: The ADIOS library”.
  • May 12: SciNet participated in Science Rendezvous.
  • May 14-18: SCICOMP, the IBM HPC Systems Scientific Computing User Group (which is part of the meeting of SPXXL, the user group of large IBM installations).

SciNet News May 2012

May 5, 2012 in for_researchers, for_users, newsletter

SYSTEM CHANGES

  • GPC: The GPC has been upgraded to a low-latency, high-bandwidth Infiniband network throughout the cluster. Several significant benefits over the old ethernet/infiniband mixed setup are expected, including:

    • better I/O performance for all jobs,
    • better job performance for what used to be multinode ethernet jobs (as they will now use InfiniBand), and
    • for users who were already using InfiniBand, improved queue throughput (there are now four times as many available nodes) and the ability to run larger IB jobs.

    The temporary mpirun settings previously recommended for multinode ethernet runs are no longer needed, as all MPI traffic now goes over InfiniBand. In most cases, “mpirun -np X” will work. For more details on running MPI jobs on, for instance, QDR InfiniBand nodes specifically, see the wiki page on ‘GPC MPI Versions.’

    We are very interested in learning about your experiences (positive or negative) with the new infiniband network, which you can email to support@scinet.utoronto.ca.

EVENTS COMING UP

  • May 8-9: SCHEDULED SHUTDOWN OF ALL SCINET SYSTEMS

    There will be a full SciNet shutdown from Tue May 8 to Wed May 9 for final configurations in the changeover to full infiniband for the GPC, for some back-end maintenance, and to test the computational and file system performance of the TCS and GPC.

    Systems will go down at 9:00 am on May 8; all login sessions and jobs will be killed at that time. The system should be available again in the evening of the next day. Check the wiki on Wednesday for updates.

  • Wed May 9, 10:30 am – 12:00 noon: INTRO TO SCINET

    Learn what SciNet resources are available, how to compile your code and how to use the batch system, in approximately 90 minutes.

    Intended for new users, but experienced users may still pick up some valuable pointers.

    Sign up at https://support.scinet.utoronto.ca/courses/?q=node/65

    Note that attendees of the Intro may be interested in the immediately following event:

  • Wed May 9, 12:00 noon – 1:00 pm: MAY SCINET USER GROUP (SNUG) MEETING

    The SciNet Users Group (SNUG) meetings are every month on the second Wednesday, and involve pizza, user discussion, feedback, and one or two short talks on topics or technologies of interest to the SciNet community.

    This time, we will have:

    • TechTalk by Jonathan Dursi on “Parallel I/O doesn’t have to be so hard: The ADIOS library”
    • User discussion
    • Pizza!

    Sign up at: https://support.scinet.utoronto.ca/courses/?q=node/51

  • May 14-18: SCICOMP

    SciNet will host the annual meeting of ScicomP, the IBM HPC Systems Scientific Computing User Group. This meeting (which is part of the meeting of SPXXL, the user group of large IBM installations) is open to users and deals mostly with applications and science rather than just with technical aspects of the computers.

    This event will not be held at the SciNet Headquarters. For more information on this event, its schedule, location and registration, go to http://spscicomp.org/scicomp2012 .

  • Wed Jun 13, 12:00 noon – 1:00 pm: JUNE SNUG MEETING

    • TechTalk by Ramses van Zon on “Remote development on SciNet systems”
    • User discussion
    • Pizza!

    Sign up at https://support.scinet.utoronto.ca/courses/?q=node/66

ADDED TO THE WIKI IN APRIL

All new wiki content below is listed and linked on the main page: http://wiki.scinethpc.ca/wiki/index.php/SciNet_User_Support_Library#What.27s_New_On_The_Wiki

  • Updates to all things pertaining to MPI on the GPC, in light of its new Infiniband network, including

    • Updated User Tutorial
    • Updated Scheduler Information
    • Updated GPC Cluster Information
  • Slides of the one-day C++ course.
  • A FAQ entry on how to deal with InfiniBand memory problems
  • Slides of the TechTalk on “Infiniband on the GPC”

WHAT ELSE HAPPENED AT SCINET IN APRIL?

  • Apr 11: SNUG meeting was held, with a TechTalk by Scott Northrup on “Infiniband on the GPC”
  • Apr 12: Scheduled shutdown on the TCS
  • Apr 18-19: Scheduled shutdown of all SciNet Systems for preparations for the new Blue Gene/Q system
  • Apr 19: GPC upgraded to full Infiniband
  • Apr 23: One day course on Scientific C++

The Portal: Volume 2

April 20, 2012 in for_press, newsletter

SciNet’s newsletter for 2012.

The_Portal_V2

Download a PDF copy

SciNet News March 2012

March 5, 2012 in for_researchers, for_users, newsletter

EVENTS COMING UP

  • Wed Mar 14, 12:00 noon: SCINET USER GROUP (SNUG) MEETING

    The SciNet Users Group (SNUG) meetings are every month on the second Wednesday, and involve pizza, user discussion, feedback, and one or two short talks on topics or technologies of interest to the SciNet community.

    This time, we will have

    • TechTalk by Ramses van Zon (SciNet) on

      “Intel Math Kernel Library”

      The Intel MKL is a highly optimized, high-performance mathematical library with C, C++, and Fortran interfaces, containing, among other things, implementations of BLAS, LAPACK, and FFTW. It is available on the GPC, which has Intel chips for which the MKL is tuned, and it has become rather easy to use with recent versions of the Intel compiler suite. We will give an overview of the capabilities of this library and how you can use it in your own code.

    • User discussion
    • Pizza!

    Sign up at https://support.scinet.utoronto.ca/courses/?q=node/49

  • Mar 16: Final lecture of SCIENTIFIC COMPUTING 3: HIGH PERFORMANCE SCIENTIFIC COMPUTING
  • Mar 26, 2:00 pm – 4:30 pm: INTRO TO THE LINUX SHELL

    Extremely useful for new users of SciNet who are not yet familiar with the Linux shell or other Unix command-line environments.

    For more information and sign up, go to https://support.scinet.utoronto.ca/courses/?q=node/64

  • Mar 28, noon – 1:30 pm: INTRO TO SCINET

    Learn what SciNet resources are available, how to compile your code and how to use the batch system, in approximately 90 minutes.

    Intended for new users, but experienced users may still pick up some valuable pointers.

    Sign up at https://support.scinet.utoronto.ca/courses/?q=node/63

  • SciNet is a local seminar location for the Coast-to-Coast seminar series. Future Dates/times: Mar 20 and Apr 3. 2:30 to 3:30 EST. More info at www.irmacs.sfu.ca/events/coast-coast-seminars.
  • Apr 11/May 9, at noon: FUTURE SNUG MEETINGS

    April’s TechTalk will be by Scott Northrup (SciNet) on the differences between multinode MPI jobs using ethernet and Infiniband.

    We are still looking for users willing to give a short talk (20-30 minutes) at the May SNUG about interesting work that they did on SciNet clusters and how they did it! If you are up for it, email support@scinet.utoronto.ca.

    More info on future SNUGs and sign-up at https://support.scinet.utoronto.ca/courses/?q=node/50 (Apr) https://support.scinet.utoronto.ca/courses/?q=node/51 (May)

  • Apr 23: INTRODUCTION TO SCIENTIFIC C++

    This is a one-day course that will introduce you to the various features of C++ with a focus on those that are useful for scientific software development. We will take the C-to-C++ route, so familiarity with C, in particular with pointers, is a prerequisite. We will cover:

    • a basic refresher of C;
    • the nice features of C++ (“a better C”);
    • object oriented programming (classes, inheritance, …);
    • very basic generic programming with templates;
    • a discussion of some useful libraries out there.

    Sign up at https://support.scinet.utoronto.ca/courses/?q=node/62

SYSTEM CHANGES

  • GPC: Due to some changes we are making to the GigE nodes, if you run multinode ethernet MPI jobs, you will still need to explicitly request the ethernet interface in your mpirun:

    For Openmpi: mpirun --mca btl self,sm,tcp

    For IntelMPI: mpirun -env I_MPI_FABRICS shm:tcp

    There is no need to do this if you run on IB, or if you run single-node MPI jobs on the ethernet (GigE) nodes. Please check the wiki page on ‘GPC MPI Versions’ for more details. We expect these changes to be finished in the next month.

  • GPC: A new version of the Intel compiler suite has become the default module. The C/C++ and fortran compilers in this suite are at version 12.1.3, while the MKL library is at version 10.3.9.
  • GPC: New versions of parallel-netcdf and mpb have been installed.
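
The mpirun lines above would appear in a batch script roughly as follows (a hypothetical sketch: the node counts, module names and executable are made up, and the scheduler directives follow the usual Torque/Moab style on the GPC):

```shell
#!/bin/bash
# Hypothetical GPC job script for a multinode ethernet MPI run.
#PBS -l nodes=2:ppn=8,walltime=1:00:00
cd $PBS_O_WORKDIR

module load intel openmpi    # module names are assumptions

# OpenMPI: explicitly request the tcp (ethernet) interface.
mpirun --mca btl self,sm,tcp ./my_mpi_app

# IntelMPI equivalent (if intelmpi is loaded instead):
# mpirun -env I_MPI_FABRICS shm:tcp ./my_mpi_app
```

On Infiniband nodes, or for single-node MPI jobs, the extra interface flags can be left out.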

ADDED TO THE WIKI IN FEBRUARY/EARLY MARCH 2012

All new wiki content below is listed and linked on the main page: http://wiki.scinethpc.ca/wiki/index.php/SciNet_User_Support_Library#What.27s_New_On_The_Wiki

  • The slides of the lectures of part III of the Scientific Computing Course on “High Performance Scientific Computing”.
  • The list of modules installed on the Power 7 Linux cluster.

WHAT ELSE HAPPENED AT SCINET IN FEBRUARY/EARLY MARCH 2012?

  • Feb 8: SNUG meeting was held, with a TechTalk by Jonathan Dursi on “Tuning your MPI application without writing code: mpitune and otpo”.
  • Feb 10, 17 and March 2: Three lectures of part 3 of SciNet’s Scientific Computing course on “High Performance Scientific Computing” were given.
  • Thu Feb 23 and Fri Feb 24: SciNet hosted and co-taught a Software Carpentry scientific computing boot-camp.

SciNet News February 2012

February 5, 2012 in for_researchers, for_users, newsletter

EVENTS COMING UP

  • Thu Feb 9, 9:00 am: SCHEDULED DOWNTIME OF THE SCINET CLUSTERS

    To mitigate some of the file system problems, there will be a relatively short downtime of all SciNet systems on Thursday to perform a reconfiguration. The downtime is expected to last approximately two hours. Check the wiki for updates.

  • Wed Feb 8, 12:00 noon: SCINET USER GROUP (SNUG) MEETING

    The SciNet Users Group (SNUG) meetings are every month on the second Wednesday, and involve pizza, user discussion, feedback, and one or two short talks on topics or technologies of interest to the SciNet community.

    This time, we will have

    • TechTalk by Jonathan Dursi (SciNet) on

    “Tuning your MPI application without writing code: mpitune and otpo”

    MPI libraries are very complicated packages, with many tunable parameters that affect their behaviours. These parameters are set to reasonable default settings that should make sense for most applications; but sometimes modest adjustments to these settings can improve the performance of your code. We’ll discuss automated tools for the IntelMPI and OpenMPI libraries which allow testing large numbers of these parameters and how they can help you improve performance of your code.

    • User discussion
    • Pizza!

    Sign up at https://support.scinet.utoronto.ca/courses/?q=node/48

  • Fri Feb 10,17, Mar 2,9: SCIENTIFIC COMPUTING 3: HIGH PERFORMANCE SCIENTIFIC COMPUTING

    Part III of SciNet’s Scientific Computing course. These parts can be taken as “mini-courses” or “modular courses” by astrophysics and physics graduate students.

    More info: wiki.scinethpc.ca/wiki/index.php/High_Performance_Scientific_Computing

    Sign-up:
    https://support.scinet.utoronto.ca/courses/?q=node/45

  • Thu Feb 23 and Fri Feb 24: SciNet will be hosting and co-teaching a Software Carpentry scientific computing boot-camp during reading week.

    Since 1998, Software Carpentry has taught scientists and engineers the skills and tools they need to use computing more productively. Thanks to a grant from the Sloan Foundation, we are running two-day workshops at selected institutions, followed by 4-8 weeks of self-paced online learning. Each workshop will cover:

    • Using the Unix shell to get more done in less time
    • Using version control to manage and share information
    • Basic Python programming
    • How (and how much) to test programs
    • Working with relational databases

    The online follow-up will go deeper into these topics, and also touch on program design and construction, matrix programming, data management, and development life cycles for small research teams.

    Registration details to follow; keep an eye on https://support.scinet.utoronto.ca/courses/

  • SciNet is a local seminar location for the Coast-to-Coast seminar series. Dates/times: Feb 21, Mar 6, Mar 20, Apr 3. 2:30 to 3:30 EST. More info at www.irmacs.sfu.ca/events/coast-coast-seminars.
  • Mar 14/Apr 11/May 9, at noon: FUTURE SNUG MEETINGS

    We are still looking for users (students, postdocs, staff, faculty, it does not matter) willing to give a short talk (20-30 minutes) about interesting work that they did on SciNet clusters and how they did it! If you are up for it, email support@scinet.utoronto.ca.

    More info on future SNUGs and sign-up at https://support.scinet.utoronto.ca/courses/?q=node/49 (Mar) https://support.scinet.utoronto.ca/courses/?q=node/50 (Apr) https://support.scinet.utoronto.ca/courses/?q=node/51 (May)

SYSTEM CHANGES

  • GPC: Due to some changes we are making to the GigE nodes, if you run multinode ethernet MPI jobs, you will need to explicitly request the ethernet interface in your mpirun:

    For Openmpi: mpirun --mca btl self,sm,tcp

    For IntelMPI: mpirun -env I_MPI_FABRICS shm:tcp

    There is no need to do this if you run on IB, or if you run single node mpi jobs on the ethernet (GigE) nodes. Please check the wiki page on ‘GPC MPI Versions’ for more details.

  • The new Resource Allocations have taken effect on January 9 for groups who were awarded an allocation. This includes storage allocations.
  • Note that the ‘diskUsage’ command from the ‘extras’ module can be used to query your disk usage, your group’s disk usage, and the quotas (including number of files), for each of the file systems that you have access to.
  • For groups with storage allocations, we will start to make backups of /project. This is possible since most material now resides on HPSS. Note that this backup system does not keep full snapshots of the past, but only a copy of the latest version of each file. So, if any data accidentally gets deleted and you contact us quickly, it can be restored.
  • GPC: On January 30th, CentOS 5 was phased out.
  • GPC: A more recent module for valgrind/3.7.0 was installed which includes valkyrie, a visualization tool for memcheck.
  • GPC: A module for scalapack/2.0.1 was installed
  • GPC: A newer version of R was installed as module R/2.14.1 (users have to explicitly request this version; 2.13.1 is still the default).
  • GPC: Newer versions of the GSL were installed as modules gsl/1.15-gsl and gsl/1.15-intel (these are also not yet the default).
  • A milestone was reached on Sunday February 5th when the 10,000,000th job ran on SciNet. It started at 4:31 am and ran for 2 hours 13 minutes and 31 seconds. It was a job from the ATLAS project (http://atlas.ch).
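
The diskUsage and module changes above translate into session commands along these lines (a sketch; the exact output format is not shown, and option names beyond what is listed above are not assumed):

```shell
# Query your disk usage, your group's usage, and quotas
# (including file counts) via the 'extras' module.
module load extras
diskUsage

# Explicitly request the non-default R and GSL versions
# mentioned in the system changes.
module load R/2.14.1
module load gsl/1.15-intel
```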

ADDED TO THE WIKI IN JANUARY 2012

All new wiki content below is listed and linked on the main page: 
http://wiki.scinethpc.ca/wiki/index.php/SciNet_User_Support_Library#What.27s_New_On_The_Wiki
  • The slides of the lectures of part II of the Scientific Computing Course on “Numerical Tools for Physical Scientists”.
  • The page on ‘GPC MPI Versions’ was updated.
  • Information about part III of the Scientific Computing Course on “High Performance Scientific Computing”.

WHAT ELSE HAPPENED AT SCINET IN JANUARY?

  • Jan 9: “Intro to the Linux shell” session was given.
  • Jan 11: “Intro to SciNet” session was held.
  • Jan 11: SNUG meeting was held, with a TechTalk by Chris Neale on “Kinetics of Hydrophobic Gating and Energetics of Magnesium Permeation in the Bacterial Divalent Cation Transport System CorA”
  • Jan 13,20,27, Feb 3: Part 2 of SciNet’s Scientific Computing course on “Numerical Tools for Physical Scientists” was given.

SciNet News January 2012

January 5, 2012 in for_researchers, for_users, newsletter

SYSTEM CHANGES

  • The new Resource Allocations will take effect on January 9, for groups who were awarded an allocation.
  • There will be a scheduled down-time on January 17/18. We will take that opportunity to update the OS of the login nodes to CentOS 6.
  • On January 30th, CentOS 5 will be phased out. If you are still doing comparison runs between the two OSs, make sure that they are done before then.
  • The “diskUsage” command has been improved and its output has been simplified.

EVENTS COMING UP

  • Mon Jan 9, 2:00 pm – 4:00 pm: INTRO TO THE LINUX SHELL

    Extremely useful for new users of SciNet who are not yet familiar with the Linux shell or other Unix command-line environments.

    For more information and sign up, go to https://support.scinet.utoronto.ca/courses/?q=node/55

  • Wed Jan 11, 10:00 am – 11:30 am: INTRO TO SCINET

    Learn what SciNet resources are available, how to recompile your code and how to use the batch system, in approximately 90 minutes.

    Intended for new users, but experienced users may still pick up some valuable pointers.

    Sign up at https://support.scinet.utoronto.ca/courses/?q=node/56

    Note that attendees of the intro may be interested in attending the following event:

  • Wed Jan 11: SCINET USER GROUP (SNUG) MEETING

    The SciNet Users Group (SNUG) meetings are every month on the second Wednesday, and involve pizza, user discussion, feedback, and one or two short talks on topics or technologies of interest to the SciNet community.

    This time, we will have

    • TechTalk by Chris Neale: “Kinetics of Hydrophobic Gating and Energetics of Magnesium Permeation in the Bacterial Divalent Cation Transport System CorA” (co-authors: Pawel Pomorski, Nilmadhab Chakrabarti, Emil F. Pai and Régis Pomès)
    • User discussion
    • Pizza!

    Sign up at https://support.scinet.utoronto.ca/courses/?q=node/47

  • Feb 8/Mar 14/Apr 11/May 9, at noon: FUTURE SNUG MEETINGS

    We are still looking for users (students, postdocs, staff, faculty, it does not matter) willing to give a short talk (20-30 minutes) about interesting work that they did on SciNet clusters and how they did it! If you are up for it, email support@scinet.utoronto.ca.

    More info on future SNUGs and sign-up at https://support.scinet.utoronto.ca/courses/?q=node/48 (Feb) https://support.scinet.utoronto.ca/courses/?q=node/49 (Mar) https://support.scinet.utoronto.ca/courses/?q=node/50 (Apr) https://support.scinet.utoronto.ca/courses/?q=node/51 (May)

  • Fri Jan 13,20,27, Feb 3: SCIENTIFIC COMPUTING 2: NUMERICAL TOOLS FOR PHYSICAL SCIENTISTS

    Part II of SciNet’s Scientific Computing course. These parts can be taken as “mini-courses” or “modular courses” by astrophysics and physics graduate students.

    Topics of part II: Modelling, floating point computations, validation + verification, ODEs, Monte Carlo, linear algebra, FFT.

    More info: wiki.scinet.utoronto.ca/wiki/index.php/Numerical_Tools_for_Physical_Scientists

    Sign-up: https://support.scinet.utoronto.ca/courses/?q=node/44

  • Fri Feb 10,17, Mar 2,9: SCIENTIFIC COMPUTING 3: HIGH PERFORMANCE SCIENTIFIC COMPUTING

    Part III of SciNet’s Scientific Computing course. These parts can be taken as “mini-courses” or “modular courses” by astrophysics and physics graduate students.

    Topics of part III: Profiling, optimization, openmp, mpi and hybrid programming.

    More info: https://support.scinet.utoronto.ca/courses/?q=node/39 Sign-up: https://support.scinet.utoronto.ca/courses/?q=node/45

ADDED TO THE WIKI IN DECEMBER 2011

All new wiki content below is listed and linked on the main page:
http://wiki.scinethpc.ca/wiki/index.php/SciNet_User_Support_Library#What.27s_New_On_The_Wiki

  • Information about part II of the Scientific Computing Course on “Numerical Tools for Physical Scientists”.
  • SNUG Techtalk Dec 2011 “Intel Compiler Optimizations”
  • Updates on the GPC transition to CentOS 6.

WHAT ELSE HAPPENED AT SCINET IN DECEMBER 2011?

  • On Dec 14 a SciNet User Group (SNUG) Meeting was held with a TechTalk on “Compiler and optimization flags on the GPC” by Scott Northrup from SciNet.
  • On Dec 15, a day-long intro course on GPGPU programming with CUDA was given.
  • A power outage happened at the data center on Dec 28, 2011.