Tutorials are meant to complement the OCEANS technical program. They do so by describing the fundamental elements of a technology and/or the rudiments of a subject in a classroom setting. In this way, participants have the opportunity to learn the how and why of a particular subject or field of technology. Tutorial sessions are held on Monday, September 20, 2010, the day before the formal opening of the conference. They are given in the core OCEANS technology areas listed in the Technical Program section of this web site.
The tutorials are essentially special classes describing the fundamental elements of a technology subject. They may be full- or half-day sessions. There is also a benefit to speakers: delivering a tutorial gives them an occasion to share their knowledge and expertise with other members of the ocean engineering/marine technology community.
Registration for tutorials is done through the conference registration page. You may register for the tutorials alone, or as part of a larger registration for the conference.
If you would like to correspond with the Chair of the Tutorial session, please email:
tutorials@oceans10mtsieeseattle.org.
NOTES: The Training Class "The Principles of Underwater Lighting" is NOT being offered at OCEANS'10. Additionally, Tutorial T2 has been cancelled. Unfortunately, these two cancellations were made after the paper copy of the OCEANS'10 program book was printed. The PDF version of the OCEANS'10 Program Book will have the correct information prior to the conference.
PRICING
Tutorial prices (in US dollars) for IEEE and MTS members, non-members, and students are given below. The deadline for the early registration discount is August 15th (on-line or by mail prior to 11:59 PM EDT).
CATEGORY    | EARLY REGISTRATION  | LATE REGISTRATION
            | Half-day / Full-day | Half-day / Full-day
Members     | $175 / $250         | $200 / $300
Non-Members | $200 / $300         | $225 / $350
Students    | $75 / $100          | $100 / $150
SCHEDULE
Given below is the schedule of the tutorials. All take place on Monday, September 20, 2010. Click on the tutorial title to go to the details of that tutorial.
MONDAY : 8:30 - 17:00
T2 | CANCELLED (SonarWiz 5: Seafloor Survey Acquiring and Processing Data)
MONDAY : 8:30 - 12:00
T4 | The Stochastic Matched Filter and Its Applications to Detection and De-noising
T5 | Ocean Wave Measurement and Analysis
MONDAY : 13:00 - 17:00
T1 | Working with a Cabled Observatory: How Oceans 2.0 Will Help Your Science in the Face of So Much More Data
T3 | Acoustic Seabed Classification with Multibeam and Sidescan Images
TUTORIAL DESCRIPTIONS
Details of the tutorials are given below:
T1. Working with a Cabled Observatory: How Oceans 2.0 Will Help Your Science in the Face of So Much More Data
Instructor: Benoît Pirenne
Session length: Half day / afternoon
Large cabled observatories are being deployed around the world and represent a foray of the ocean sciences into the realm of “Big Science”. Just as happened to physicists and astronomers a couple of decades ago, ocean scientists now have to brace for a very different way of working: orders of magnitude more data, a different approach to obtaining those data, the real-time nature of data arrival, and the immediately public nature of all data exemplify the transformational impact that these new infrastructures will have on everyone involved in the study of our oceans.
Today, large particle accelerators and large telescopes are used remotely by scientists who sometimes only receive the outcome of their requested experiments without ever seeing the machinery used to perform them. Similarly, ocean scientists involved with large real-time observatories now work mostly from a distance, programming their experiments and interacting with the underwater assets and the resulting data from their laboratory, or perhaps from an airport waiting lounge, using a powerful cyber-infrastructure as a proxy to the underwater assets.
The goal of this tutorial is to demonstrate some of the tools that are now available to users of the NEPTUNE Canada (NC) cabled observatory as an example of what the science community can expect from other nascent systems, such as the Ocean Observatories Initiative (OOI) in the US. User features of the cyber-infrastructure serving NEPTUNE Canada and VENUS will be exercised live, and participants, using their own laptops, will be able to perform operations with NC data and other assets. The tutorial will start with a backgrounder on the technologies and the reasons for their selection. It will then move on to some of the observatory management capabilities offered to the operations staff, to illustrate some of the challenges that a large-scale facility represents. Finally, the “normal” user tools will be covered in detail.
The user interface, part of the cyber-infrastructure supporting NC and VENUS, is called “Oceans 2.0” because it rests on many of the concepts of the Web 2.0 movement, notably “collaboration” and “contribution”. With data volumes swelling so fast while the number of scientists remains constant at best, the only way to address the situation is to provide scientists with better tools, allowing them to collaborate easily toward their science goals and allowing many individuals (or external intelligent software agents) to contribute to the data content.
The tutorial program will include:
Background on the observatory infrastructure:
- physical view
- network view
- logical (topological) view
Management functions of the cyber-infrastructure:
- power management
- network management and cyber-security
- data flow management
Data and metadata access functions in Oceans 2.0 are performed using web interfaces. Tools exist to:
- perform data search
- visualize data (e.g., plotting utility)
- explore metadata
- very efficiently browse through thousands of hours of video files collected over time by the observatory and annotated by users (SeaTube)
Presenter’s Bio – Benoît Pirenne
Benoît Pirenne has been NEPTUNE Canada's Associate Director, Information Technology, at the University of Victoria since October 2004. He is in charge of all Data Management and Archiving aspects (from system development to operations) of both the VENUS and NEPTUNE Canada cabled ocean observatories. He directs a group of over 20 computer professionals organized in three teams spanning development, software QA/QC, and production and operations. He is also a member of the NEPTUNE Canada Executive Team and helps provide direction for the future of the infrastructure.
Previously, Benoît Pirenne spent 18 years at the European Southern Observatory (ESO, Munich, Germany), a world-leading organization for astronomical research. At ESO he held a number of scientific and technical positions, ranging from Hubble Space Telescope Archive Scientist to Head of the Operations Technical Support Department. In this latter role he was responsible for running the Data Management and Archiving system supporting both ESO's telescopes and NASA/ESA's Hubble Space Telescope, leading a team of over 15 individuals.
Benoît has a BS in computer science from Liège, Belgium, and a Master's in computer science from the University of Namur, Belgium.
T2. Cancelled.
T3. Acoustic Seabed Classification with Multibeam and Sidescan Images
Instructor: Jon Preston
Session length: Half day / afternoon
The tutorial presents theory and applications of image-based acoustic classification, from the early papers through to recent applications. It is organized in four parts of roughly equal emphasis and importance.
- The theoretical portion covers the concepts of acoustic backscatter from the seabed. Seabed sediments are characterized acoustically by roughness and acoustic impedance. Interface scattering strength depends on grazing angle and on sediment type. Volume scattering, which is inseparable from interface scattering, also depends on sediment type. The emphasis is on how sediment information appears in echoes rather than on the details of backscatter theory.
- The section on classification methods starts with essential concepts, such as the ambiguities in inverting acoustic backscatter theory and the resulting emphasis on phenomenological methods of seabed classification. The current terminology is seabed characterization by inversion versus seabed classification by empirical methods. The principles of classifying with echo time series from single-beam sounders at normal incidence are presented next, since they came first historically and contain many of the concepts used in classification with sonar images.
- The third part turns squarely to sonar images, starting with a review of how they are assembled from backscatter in both sidescan and multibeam sonars. It then shows that near nadir the amplitudes and shapes of sounder echoes are rich in sediment information, whereas away from vertical incidence echoes carry sediment information in their amplitudes and noise characteristics but not in their shapes. Echoes from imaging sonars, with their wide horizontal beamwidths, become rasters in sonar images, so noise in these echoes becomes image texture. Classification with images therefore requires features that depend on image amplitude, noise, and texture; examples of these are given, and a minimal sketch follows this list. The most important of these features are derived from gray-level co-occurrence matrices.
- The final part covers multivariate statistical processing of the features that capture acoustic character, through to making class maps in a GIS. The steps are dimension reduction through principal components analysis or other methods, feature space, clustering (both manual and objective), and categorical interpolation; a brief sketch of this statistical step also appears below.
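As a concrete complement to the third part of the list, the following is a minimal, hedged sketch of how gray-level co-occurrence texture features might be computed for a quantized sonar-image patch. The synthetic patch, 16 gray levels, single pixel offset, and the particular feature set are illustrative assumptions, not the tutorial's prescriptions.

```python
# Illustrative sketch only: GLCM texture features for one quantized image patch.
import numpy as np

def glcm(patch, dx=1, dy=0, levels=16):
    """Symmetric, normalized gray-level co-occurrence matrix for one offset."""
    g = np.zeros((levels, levels))
    h, w = patch.shape
    for y in range(h - dy):
        for x in range(w - dx):
            i, j = patch[y, x], patch[y + dy, x + dx]
            g[i, j] += 1
            g[j, i] += 1                     # count each pair in both orders
    return g / g.sum()

def glcm_features(g):
    """A few classic Haralick-style features derived from the GLCM."""
    i, j = np.indices(g.shape)
    nz = g[g > 0]
    return {
        "contrast":    float(np.sum(g * (i - j) ** 2)),
        "homogeneity": float(np.sum(g / (1.0 + (i - j) ** 2))),
        "entropy":     float(-np.sum(nz * np.log(nz))),
        "energy":      float(np.sum(g ** 2)),
    }

# Toy patch standing in for a 64 x 64 backscatter tile quantized to 16 levels.
rng = np.random.default_rng(0)
patch = rng.integers(0, 16, size=(64, 64))
print(glcm_features(glcm(patch)))
```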
The intended audience is engineers and scientists who want an understanding of echo and image character or who want to learn about feature-based statistical processing, as well as those who want to make maps of seabed classes. Participants in this tutorial can expect to gain a thorough understanding of the principles and practice of image-based sediment classification.
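For the final part of the list (dimension reduction, clustering, and class mapping), here is a similarly hedged sketch of the multivariate step, using scikit-learn PCA and k-means as stand-ins for the statistical tools discussed; the feature table, three components, and four classes are assumptions made for illustration.

```python
# Illustrative sketch only: reduce per-patch feature vectors with PCA, then
# cluster the patches into acoustic classes.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
features = rng.normal(size=(500, 12))            # stands in for 500 patches x 12 features

X = StandardScaler().fit_transform(features)     # put features on comparable scales
scores = PCA(n_components=3).fit_transform(X)    # dimension reduction to feature space
labels = KMeans(n_clusters=4, n_init=10).fit_predict(scores)  # objective clustering

# 'labels' holds one seabed-class label per patch, ready for categorical
# interpolation and mapping in a GIS.
print(np.bincount(labels))
```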
Presenter’s Bio – Jon Preston:
Jon Preston has twelve years of experience as a research scientist with the Department of National Defence, Canada. At the Defence Research Establishment Pacific (DREP), Victoria, BC, he developed mine countermeasures technology, particularly the processing of high-frequency imagery and towfish altitude and position measurement and control. Much of the work done by the DREP mine countermeasures group can be seen in the route-survey payload of the 12 Maritime Coastal Defence Vessels operated by the Canadian Navy.
Starting about 1996, his research interests broadened to include both acoustic and invasive sensing of sediments, and he led several multi-faceted research cruises focusing on sediments. During his years at DREP he published over 20 internal reports and 10 papers in conference proceedings, and he was technical chairman of the IEEE OCEANS '93 conference. He joined Quester Tangent Corporation in late 1998 and is a senior scientist there, leading the research and development in acoustic seabed classification. QTC 5, a system for classification in very shallow water using echosounders, has been well accepted and is widely used. He spearheaded the development of the swath-classification software QTC MULTIVIEW and QTC SIDEVIEW for multibeam and sidescan processing; both products are widely used and accepted throughout the industry.
Dr. Preston’s B.Sc. is from McMaster University, 1970. His Ph.D. work, in the plasma physics group at the University of British Columbia (UBC), dealt mostly with sensors, diagnostic equipment, and signal processing. Between graduating from UBC in 1974 and moving to DREP in 1987, he was responsible for Canada’s research and development program in automatic detection of chemical warfare compounds. This work led to a family of new detectors used in many NATO armies, 25 internal reports, 4 journal papers, and 2 patents.
T4. The Stochastic Matched Filter and Its Applications to Detection and De-noising
Instructor: Philippe Courmontagne
Session length: Half day / morning
In several domains of signal processing, such as detection and de-noising, it can be useful to provide a second-moment characterization of a noise-corrupted signal in terms of uncorrelated random variables. The noisy data can then be described by an expansion into a weighted sum of known vectors, with uncorrelated random variables as the weights. Depending on the choice of the basis vectors, some of these random variables carry more signal-of-interest information than noise. This is the case, for example, when a signal disturbed by white noise is expanded using the Karhunen-Loève expansion. Under these conditions, it is possible either to approximate the signal of interest by keeping only its associated random variables, or to detect a signal in a noisy environment by analyzing the power of the random variables. The purpose of this tutorial is to present such an expansion, available for both the additive and multiplicative noise cases, and its application to detection and de-noising. This noisy-random-signal expansion is known as the stochastic matched filter; its basis vectors are chosen so as to maximize the signal-to-noise ratio after processing.
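To make the idea concrete, here is a minimal, hedged sketch of an SNR-maximizing expansion for the additive-noise case with known signal and noise covariance matrices: the basis vectors come from a generalized eigenproblem, and de-noising truncates the expansion at an order Q. The AR(1) signal covariance, white-noise covariance, window length, and the "keep eigenvalues above 1" retention rule are illustrative assumptions, not the instructor's material.

```python
# Illustrative sketch only: SNR-maximizing expansion of a noisy window z, with
# basis vectors from the generalized eigenproblem Rs v = lambda Rn v, and
# de-noising by truncating the expansion at order Q.
import numpy as np
from scipy.linalg import eigh

def smf_basis(Rs, Rn):
    """Eigenvectors of Rs v = lam Rn v, sorted by decreasing eigenvalue.
    scipy normalizes them so that V.T @ Rn @ V = I."""
    lam, V = eigh(Rs, Rn)
    order = np.argsort(lam)[::-1]
    return lam[order], V[:, order]

def smf_denoise(z, Rs, Rn, q=None):
    """Expand z on the basis and keep the first q terms of the expansion."""
    lam, V = smf_basis(Rs, Rn)
    if q is None:
        q = max(1, int(np.sum(lam > 1.0)))   # components with post-filter SNR > 1
    coeffs = V.T @ z                          # uncorrelated expansion coefficients
    recon = Rn @ V                            # reconstruction (dual basis) vectors
    return recon[:, :q] @ coeffs[:q]

# Toy example: correlated signal (AR(1) covariance) in unit-variance white noise.
n = 64
idx = np.arange(n)
Rs = 0.95 ** np.abs(idx[:, None] - idx[None, :])   # assumed signal covariance
Rn = np.eye(n)                                      # assumed white-noise covariance
rng = np.random.default_rng(2)
z = rng.multivariate_normal(np.zeros(n), Rs) + rng.standard_normal(n)
s_hat = smf_denoise(z, Rs, Rn)
print(s_hat.shape)
```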
This tutorial is divided into three parts:
- The first part concerns the theory itself: the stochastic matched filter will be described for 1-D discrete-time signals, along with its extension to 2-D discrete-space signals. Two different noise cases will be studied: the white-noise case and the speckle-noise case.
- In the second part, the stochastic matched filter will be described in a detection context and applied to signals from underwater acoustics. The results obtained will be compared with those given by the classical matched filter.
- In the last part, the stochastic matched filter will be presented in a de-noising context. Since de-noising is performed by truncating the noisy-data expansion at order Q, two criteria for determining Q will be introduced. Experimental results on real SAS data will be given to evaluate the performance of this approach.
This tutorial is intended for engineers and scientists involved in 1-D/2-D signal or array processing who would like an overview of these effective methods.
Presenter’s Bio - Philippe Courmontagne
Philippe Courmontagne was born in 1970. He received the Ph.D. degree in physics from the University of Toulon (France) in 1997. In 1999, he became a professor at a French electronic engineering school, the Institut Supérieur de l’Électronique et du Numérique (ISEN Toulon, France), in the fields of signal processing and image processing. In 2001, he joined the Provence Materials and Microelectronics Laboratory (L2MP UMR CNRS 6137), a unit of the French national research center (CNRS). In 2005, he obtained his Habilitation (HDR, Habilitation to Supervise Research) for his work in the field of noisy-signal expansion. In 2007, he was elevated to the grade of IEEE Senior Member in recognition of his professional standing and his work on signal de-noising (SAR and SAS images), signal detection in noisy environments, and signal transmission.
T5. Ocean Wave Measurement and Analysis
Instructor: Theodore Mettlach
Session length: Half day / morning
The ocean covers over 70 percent of Earth’s surface. Surface waves, ranging from newly formed ripples to monster waves generated inside super typhoons or by earthquakes, can challenge even the most stalwart among us venturing to sea or living on its coasts. This short course will give attendees a comprehensive picture of the state of the art in operational ocean wave measurement, including the analysis of wave data for engineering and scientific purposes.
The course is intended for scientists, engineers and students interested in ocean waves, for those looking for information on wave sensors, and for those seeking to begin or expand a program for wave measurements in deep water or in the coastal zone.
The course is in five parts over four hours. First, it begins with introductory case studies affording dramatic examples of the danger inherent in ocean waves, including a look at the highest waves ever measured.
Second, the course turns to fundamental wave theory, including descriptions of linear (Airy) waves, non-linear Stokes waves, and solitons (Korteweg-de Vries waves). It presents various wave spectra for the engineer, such as the well-known Pierson-Moskowitz spectrum and the fetch-dependent JONSWAP spectrum, and gives theories explaining the generation, propagation, and decay of waves. The course will also address waves in shallow water, wave interaction with ocean currents, and the criteria used to separate sea waves from swell waves.
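As a concrete illustration of one of the spectra just mentioned, here is a short, hedged sketch of the Pierson-Moskowitz spectrum for a fully developed sea, together with the significant wave height derived from its zeroth moment. The 15 m/s wind speed and the frequency grid are assumptions made for this example; the constants follow the usual formulation (alpha = 0.0081, beta = 0.74, wind speed measured at 19.5 m).

```python
# Illustrative sketch only: Pierson-Moskowitz variance spectrum S(f) and the
# significant wave height from its zeroth spectral moment.
import numpy as np

def pierson_moskowitz(f, u195):
    """S(f) in m^2/Hz for wind speed u195 (m/s) measured at 19.5 m height."""
    g = 9.81
    alpha, beta = 8.1e-3, 0.74
    f0 = g / (2.0 * np.pi * u195)            # frequency scale set by wind speed
    return alpha * g**2 / (2.0 * np.pi)**4 * f**-5 * np.exp(-beta * (f0 / f)**4)

f = np.linspace(0.03, 0.5, 2000)             # frequency grid in Hz
S = pierson_moskowitz(f, u195=15.0)          # assumed 15 m/s wind
m0 = np.sum(S) * (f[1] - f[0])               # zeroth spectral moment (simple sum)
hs = 4.0 * np.sqrt(m0)                       # significant wave height, roughly 5 m
print(round(hs, 2))
```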
Third, it examines various measurement techniques with respect to relative cost, areal coverage, accuracy, and timeliness. These include visual observations of waves; wave staffs, including the laser wave staff; bottom-mounted pressure sensors; acoustic Doppler sensors; pitch-roll-heave and particle-following buoys; airborne lidar; marine radar; space-borne synthetic aperture radar (SAR); and deep pressure measurements for detecting tsunamis. Emphasis will be given to deep-water wave buoys, which, at relatively low cost, provide long-term, accurate pinpoint measurements.
Fourth, the course looks at standard techniques for transforming raw data into wave measurements, including spectral and zero-crossing analysis. Attention is given to some of the limits inherent in buoy directional wave measurements, which may be described as the product of no more than one and a half degrees of freedom and which require various assumptions about how energy is spread across directions. The maximum likelihood and maximum entropy techniques for resolving wave direction are described and compared against the original method of Longuet-Higgins. Also included are techniques for computing extreme waves, the so-called 50- and 100-year waves, and a description of recent work on characterizing the wave field in a four-dimensional sense.
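As a small complement to the analysis techniques above, here is a hedged sketch of the zero-upcrossing method for estimating significant wave height from a surface-elevation record: individual waves are delimited by successive zero up-crossings and H 1/3 is the mean of the highest third of the individual wave heights. The synthetic two-component record and the 2 Hz sampling are assumptions made for illustration.

```python
# Illustrative sketch only: zero-upcrossing analysis of a surface-elevation record.
import numpy as np

def zero_crossing_hs(eta):
    """Significant wave height H_1/3 (same units as eta) from an elevation series."""
    eta = eta - eta.mean()
    up = np.where((eta[:-1] < 0) & (eta[1:] >= 0))[0]    # zero up-crossing indices
    heights = [eta[a:b].max() - eta[a:b].min()            # crest-to-trough height
               for a, b in zip(up[:-1], up[1:])]
    heights = np.sort(heights)[::-1]
    return heights[: max(1, len(heights) // 3)].mean()    # mean of highest third

# Toy record: two sinusoidal components plus a little noise, 20 minutes at 2 Hz.
t = np.arange(0.0, 1200.0, 0.5)
rng = np.random.default_rng(3)
eta = 1.0 * np.sin(2 * np.pi * 0.10 * t) + 0.5 * np.sin(2 * np.pi * 0.18 * t + 1.0)
eta = eta + 0.1 * rng.standard_normal(t.size)
print(round(zero_crossing_hs(eta), 2))
```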
Finally, in the fifth part of the course, we look at the need for calibration of sensors and the need to validate, verify and extensively test newly developed systems. Some of the work presently being done at the National Data Buoy Center is used as an example showing the level of detail that must be reached to reduce noise, ensure accuracy and refine sensitivity of measurements. These efforts are necessary to the development of protocols and reference standards applicable throughout the wave-measuring community.
Presenter’s Bio - Theodore Mettlach
Theodore Mettlach III was born in the beautiful city of Geneva, New York and raised in the capital cities Tokyo, Washington, and Ottawa. In Canada he attended St. Patrick’s College High School. After getting his fill of flying in and jumping out of U.S. Navy helicopters, he entered the meteorology program at Florida State University, finishing it in three years. He returned to the Navy in 1979 as a Geophysics Officer and completed the Master of Science in Meteorology and Oceanography curriculum at Naval Postgraduate School in 1985. For his performance there, Rear Admiral J. R. Seesholtz gave him the 1985-1986 Oceanographer of the Navy Award for Academic Excellence in the Air-Ocean Sciences, and Sigma Xi, the Scientific Research Society, elected him an associate member.
After serving as Staff Oceanographer for Commander, Joint Task Force, Middle East in the Persian Gulf during peak hostilities of the Iran-Iraq War, and immediately following promotion, Lieutenant Commander Mettlach resigned his regular commission so he could pursue private interests. He joined the NDBC Technical Services Contract, held by Computer Sciences Corporation, in 1991. That year the American Meteorological Society certified him as a Consulting Meteorologist (CCM #486). NTSC program manager Mr. Terry Towles nominated him, jointly with others, to receive the company's highest award for technical excellence for getting NDBC's first buoy-mounted ADCP to work. He also received an honorarium for the CSC Applied Technology Division's best 1994 scientific paper. The late Dr. James Paquin, founder of the company, asked him to join Neptune Sciences Inc., now the Neptune Division of Planning Systems Inc., in 1995 for a series of modeling efforts with the Naval Research Laboratory, mostly focusing on wave, surf, and tide models. During his term at Neptune he maintained connections to NDBC, providing data analysis and software development services.
Mr. Mettlach remained in the Navy Reserve for a time, augmenting CINCUSNAVEUR, London, and attaining the rank of Commander. He joined SAIC as a Senior Systems Engineer in July 2005 with the goal of carrying on NDBC's high tradition of providing world-class directional ocean wave measurements. He presently serves as NTSC Chief Scientist and has recently been involved in designing, developing, and testing mechanical wave simulators for precision calibration of the micro-electromechanical accelerometers used in the NDBC Digital Directional Wave Module, which has a patent pending. His interests span the range of activity at NDBC, from system planning to field testing, and from software development to detailed data analysis. He has authored dozens of government reports and articles in conference proceedings and three papers in the refereed scientific literature. In 2009 he was elected an SAIC Technical Fellow and is presently a member of the SAIC Technical Fellows Council. He divides his time between science and engineering work in southern Mississippi and, since Hurricane Katrina, family in Boca Raton, Florida.