Earth and Space Science. 2017 Aug 19;4(8):506–539. doi: 10.1002/2016EA000252

The Mars Science Laboratory (MSL) Mast cameras and Descent imager: Investigation and instrument descriptions

Michael C Malin 1,, Michael A Ravine 1, Michael A Caplinger 1, F Tony Ghaemi 2, Jacob A Schaffner 1, Justin N Maki 3, James F Bell III 4, James F Cameron 5, William E Dietrich 6, Kenneth S Edgett 1, Laurence J Edwards 7, James B Garvin 8, Bernard Hallet 9, Kenneth E Herkenhoff 10, Ezat Heydari 11, Linda C Kah 12, Mark T Lemmon 13, Michelle E Minitti 14, Timothy S Olson 15, Timothy J Parker 3, Scott K Rowland 16, Juergen Schieber 17, Ron Sletten 9, Robert J Sullivan 18, Dawn Y Sumner 19, R Aileen Yingst 14, Brian M Duston 1, Sean McNair 1, Elsa H Jensen 1
PMCID: PMC5652233  PMID: 29098171

Abstract

The Mars Science Laboratory Mast camera and Descent Imager investigations were designed, built, and operated by Malin Space Science Systems of San Diego, CA. They share common electronics and focal plane designs but have different optics. There are two Mastcams of dissimilar focal length. The Mastcam‐34 has an f/8, 34 mm focal length lens, and the M‐100 an f/10, 100 mm focal length lens. The M‐34 field of view is about 20° × 15° with an instantaneous field of view (IFOV) of 218 μrad; the M‐100 field of view (FOV) is 6.8° × 5.1° with an IFOV of 74 μrad. The M‐34 can focus from 0.5 m to infinity, and the M‐100 from ~1.6 m to infinity. All three cameras can acquire color images through a Bayer color filter array, and the Mastcams can also acquire images through seven science filters. Images are ≤1600 pixels wide by 1200 pixels tall. The Mastcams, mounted on the ~2 m tall Remote Sensing Mast, have a 360° azimuth and ~180° elevation field of regard. The Mars Descent Imager is fixed‐mounted to the bottom left front side of the rover at ~66 cm above the surface. Its fixed focus lens is in focus from ~2 m to infinity, but out of focus at 66 cm. The f/3 lens has a FOV of ~70° by 52° across and along the direction of motion, with an IFOV of 0.76 mrad. All cameras can acquire video at 4 frames/second for full frames or 720p HD at 6 fps. Images can be processed using lossy Joint Photographic Experts Group (JPEG) and predictive lossless compression.

Keywords: Mars, cameras, Mastcam, MARDI, Curiosity, MSL

Key Points

  • The Mars Descent Imager, an f/3, 9.7 mm, 2-Mpixel color camera, operated autonomously during landing, taking a descent video at 4 frames/second

  • Mastcam‐34 f/8, 34 mm camera takes <1600 × 1200 pixel images in broad and narrowband color over a field 20° × 15° at a scale of 218 μrad/pixel

  • Mastcam‐100, an f/10, 100 mm camera, takes <1600 × 1200 pixel images in broad and narrowband color over a field 6.8° × 5.1° at 74 μrad/pixel scale

1. Introduction

1.1. Scope

The Mars Science Laboratory (MSL) Curiosity rover landed on Aeolis Palus in northern Gale crater (latitude 4.5895°S, longitude 137.4417°E) on 6 August 2012 [Vasavada et al., 2014]. The mission's main objective is to assess the past and present habitability of Mars, with particular emphasis on exploring environmental records preserved in strata exposed by erosion in western Aeolis Palus and northwestern Aeolis Mons, also known as Mount Sharp [Grotzinger et al., 2012; Vasavada, 2016].

Curiosity has 17 cameras. These include four navigation and eight hazard cameras [Maki et al., 2012] arranged as redundant pairs, the Mars Hand Lens Imager (MAHLI) [Edgett et al., 2012], the ChemCam Remote Microscopic Imager [Le Mouélic et al., 2015], two Mast Cameras (Mastcams), and the Mars Descent Imager (MARDI).

This paper describes MSL Mastcam and MARDI (Figure 1) investigation objectives, instrumentation, and MARDI calibration. Not covered here are the following:

  1. Mastcam calibration, presented by Bell et al. [2017].

  2. Mastcam, MAHLI, and MARDI data and data products; these are archived with the National Aeronautics and Space Administration (NASA) Planetary Data System (PDS), and Malin et al. [2015], Edgett et al. [2015], and Bell et al. [2017] described these products.

  3. Science results from the MARDI Entry, Descent, and Landing (EDL) and surface observations, and the science results from the use of the Mastcam throughout the surface mission.

Figure 1.

Curiosity rover in cleanroom facilities at JPL‐Caltech on 4 April 2011 with the locations of the Mastcam and MARDI camera heads and Mastcam calibration target indicated. The digital electronics assembly (DEA) hardware for these cameras is housed inside the rover body in a thermally controlled environment. In this view, support equipment is holding the rover slightly off the floor. When the wheels are on the ground, the top of Curiosity's mast is about 2.2 m above ground level. This is a portion of NASA/JPL‐Caltech image PIA13980.

1.2. Background

Grotzinger et al. [2012] described the MSL mission, hardware, instrumentation, and science investigation. The Curiosity rover is a mobile laboratory that, as of sol 1550 (14 December 2016), had driven 15.01 km (onboard odometry) from its landing site. The rover has tools to extract particulate samples by drilling into rocks and by scooping regolith and eolian fines [Anderson et al., 2012]. Dust can be brushed from rock surfaces using the rover's Dust Removal Tool [Davis et al., 2012]. Images of surface disturbances created by rover wheels also contribute to science [Haggart and Waydo, 2008; Sullivan et al., 2011; Arvidson et al., 2014]: wheels can make trenches to probe eolian deposits, rocks can be accidentally or purposefully broken or scraped, and wheels leave tracks that provide information about regolith properties and eolian transport.

Images acquired by 11 of the rover's cameras were central to many of the results of the first 2.3 Mars years of the MSL mission. These results include identification of rock types, compositions, facies, and stratigraphic relations; rock diagenetic textures, structures, and metasomatic features; the geomorphic configuration of the landscape as it relates to rock properties, erosion history, and rover traverse planning; studies of regolith and present‐day windblown sediments; and investigations of the Martian atmosphere and meteorology.

Images from Curiosity's cameras and instruments on NASA's orbiting assets have also been vital for enabling the overall scientific success of the MSL mission, including mosaics used to plan where to drive the rover, where to stop and observe geologic materials more closely in the rover's robotic arm workspace, and where to extract samples using the rover's drill or scoop. Images are also used to monitor the condition of rover and instrument hardware, such as wheels, drill bits, instruments on the robotic arm, and sample inlet ports.

2. Science and Science‐Enabling Objectives

Investigation objectives are important, as they, in part, provide science guidance to the engineering effort of designing, building, and calibrating the investigation instrumentation. The “flow‐down” from objectives to functional requirements, from functional requirements to design requirements, and eventually to demonstrated capabilities is, perhaps, the single most important aspect of the investigation objectives.

The Mastcam and MARDI investigations were proposed separately to NASA in July 2004 in response to an April 2004 Announcement of Opportunity that solicited science investigations for the MSL mission. NASA selected Mastcam and MARDI, together with MAHLI, in December 2004. These three camera‐based investigations would share common instrument design elements and labor (staffing).

2.1. Mastcam

The primary objective of the Mastcam investigation is to acquire and use images of the environment in which the MSL rover is operational to characterize and determine details of the history and processes recorded in geologic material at the MSL field site, particularly as these details pertain to the past and present habitability of Mars. More specifically, the Mastcam investigation was designed to address the following objectives:

  • 1

    Observe landscape physiography and processes. Provide data that address the topography, geomorphology, geologic setting, and the nature of past and present geologic processes at the MSL field site.

  • 2

    Examine the properties of rocks. Observe rocks (outcrops and clasts ≥4 mm) and the results of interaction of rover hardware with rocks (e.g., drilling, brushing, and wheel breakage) to help determine morphology, texture, structure, mineralogy, stratigraphy, rock type, history/sequence, and depositional, diagenetic, and weathering processes for these materials.

  • 3

    Study the properties of fines. Examine Martian fines (clasts <4 mm) to determine the processes that acted on these materials and individual grains within them, including physical and mechanical properties, the results of interaction of rover hardware with fines (e.g., scooping, wheel imprints, and trenches), plus stratigraphy, texture, mineralogy, and depositional processes.

  • 4

    Document atmospheric and meteorological events and processes. Observe clouds, dust‐raising events (dust devils, plumes, and storms), properties of suspended aerosols (dust and ice crystals), and eolian transport and deposition of fines. Observations of the Sun through the two solar filters at different wavelengths give the opacity of the atmospheric column at two wavelengths, a function of scattering particle size. Similar observations of other parts of the sky using the normal color filters also provide information on dust particle size and composition. Observations of the distant rim of the crater and the variation of opacity as a function of height of the measurement yield a profile of opacity (dust loading) within the lower portion of the atmospheric boundary layer.

  • 5

    Support and facilitate rover operations, sample extraction and documentation, contact instrument science, and other MSL science. To help enable overall MSL scientific success, acquire images that help determine the location of the Sun and horizon features; provide information pertinent to rover traversability; provide data that help the MSL science team identify materials to be collected for, and characterize samples before delivery to, the MSL analytical instruments; help the team identify and document materials to be examined by the robotic arm‐deployed contact science instruments; and acquire images that support other MSL instruments that may need them.

Finally, the Mastcam investigation has a sixth objective that was formulated in the July 2004 proposal to NASA, a time when the MSL landing site was unknown and could have placed the rover at a latitude as high as 60° north or south of the equator:

  • 6

    View frost, ice, and related landforms and processes. Characterize frost or ice, if present, to determine texture, morphology, thickness, stratigraphic position, and relation to regolith and, if possible, observe changes over time and also examine ice‐related (e.g., periglacial) geomorphic features.
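The solar opacity measurement described in objective 4 above reduces to a Beer-Lambert calculation. As an illustrative sketch (not the Mastcam team's actual calibration pipeline), the column optical depth follows from the ratio of the measured solar flux to its unattenuated value, corrected for the slant path through the atmosphere:

```python
import math

def column_opacity(solar_flux, top_of_atmosphere_flux, solar_zenith_deg):
    """Column optical depth tau from an attenuated solar measurement.

    Beer-Lambert: I = I0 * exp(-tau / mu), so tau = -mu * ln(I / I0),
    where mu = cos(solar zenith angle) is a plane-parallel airmass term.
    Fluxes are in the same (arbitrary) units; this neglects curvature
    and multiple-scattering corrections used in real retrievals.
    """
    mu = math.cos(math.radians(solar_zenith_deg))
    return -mu * math.log(solar_flux / top_of_atmosphere_flux)
```

Measuring the opacity through the two solar filters at different wavelengths, as the text notes, constrains the scattering particle size because the wavelength dependence of tau depends on the aerosol size distribution.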

Subsequent to the July 2004 proposal, frost was observed on the Mars Exploration Rover (MER) Opportunity at an equatorial latitude [Landis and MER Athena Science Team, 2007] and has even been observed in images acquired from orbit at a latitude as low as 24°S, in terrain north of Hellas, in the middle of the afternoon [Schorghofer and Edgett, 2006].

2.2. MARDI

Malin et al. [2001] described the scientific rationale for acquisition of images during a descent to the Martian surface in the context of the Mars Polar Lander (MPL) Mars Descent Imager (MARDI) investigation. Given the importance of geologic context, Malin et al. [2001] said, “among the most important questions to be asked about a spacecraft sitting on a planetary surface is ‘Where is it?’”

At the time the MSL MARDI investigation was proposed, the launch of NASA's newest high‐resolution orbiter camera, the Mars Reconnaissance Orbiter (MRO) High Resolution Imaging Experiment (HiRISE), was still more than a year away and many critical events (launch, cruise, orbit insertion, and aerobraking) had to occur before the start of the November 2006 to December 2008 primary mission. Further, the Mars Global Surveyor Mars Orbiter Camera (MOC), which had just demonstrated its capability to image MER Opportunity and Spirit rover hardware on the Martian surface in January and February of that year [Malin et al., 2010], was in its second extended mission. Would MOC (capable of ~1.4 m/pixel imaging), HiRISE (capable of ~25 cm/pixel imaging), or a similar high‐resolution camera be available when MSL was scheduled to land in 2010? The MSL MARDI investigation was proposed, in part, to guard against the possibility of a negative answer.

Thus, the primary objective for the MSL MARDI investigation was similar to that of the MPL MARDI—to obtain visual, spatial information about the geology and geomorphology of the MSL landing site so as to locate where the spacecraft landed and, in this case, to also assist with initial rover traverse planning. More specifically, the investigation was designed to address the following objectives:

  • 1

    Determine landing site location and geologic context. Investigation of rocks for records of habitable environments requires an understanding of the geologic context and stratigraphic position of the rocks investigated. These determinations are aided by knowledge of spatial relationships between materials of differing texture, structure, color, and tone. Determine the geographic and physiographic location and geologic context of the MSL rover, quickly (nominally, within 1 Mars day, a sol), after it has landed.

  • 2

    Assist initial traverse planning. Use the color image‐based knowledge of rover location, geologic context, and mobility system (wheels) hazards to plan the initial traverse for the MSL surface mission.

  • 3

    Connect geologic observations from images acquired by cameras aboard orbiting and landed platforms. Bridge the spatial resolution gap, providing continuity of geologic and geomorphic knowledge between images acquired by orbiting and landed cameras to improve science understanding of features barely resolved or invisible in the lower‐resolution orbiter images.

  • 4

    Observe dynamic events during descent. Deduce the wind profile encountered during the portion of the descent in which images are acquired, observe the descent and impact of the MSL heat shield, and document the interaction of the MSL terminal descent engines with the landing site surface.

  • 5

    Acquire data for a descent imaging technology demonstration. Acquire data to be used to evaluate future landing systems that might incorporate active hazard avoidance capabilities.

Because the investigation is centered on observations acquired during the descent of the MSL rover to the Martian surface, the MARDI design was not required to be operational and acquire in‐focus images after the landing sol (sol 0). However, after more than 2.3 Mars years of surface operations, the camera has remained functional and, thus, a sixth objective applies:

  • 6

    Observe geologic features within the MARDI field of view along the rover traverse. For however long the MARDI camera continues to be operational, acquire occasional images of geologic materials beneath the rover, within the camera field of view. MARDI has the potential to make stratigraphic observations over the course of a traverse upslope or downslope on a rock outcrop of varied lithology, observe eolian sediments and forms created by wind erosion of rock surfaces, and document regolith properties. As a portion of the left (port) front wheel is visible in landed MARDI images, the camera can also document wheel interaction with surfaces during trenching activities.

3. Roles and Responsibilities

3.1. Instrument Development

The Mastcam, MAHLI, and MARDI science teams and Malin Space Science Systems (MSSS) developed these instruments under contract to NASA's Jet Propulsion Laboratory (JPL), a division of the California Institute of Technology. JPL manages the MSL Project for NASA's Science Mission Directorate. At the suggestion of the NASA Selecting Authority, and for ease of contracting, JPL combined the three science teams and development efforts into a single contract. JPL built the Curiosity rover and its cruise and descent stage delivery elements. MSSS developed, integrated, and tested the Mastcam and MARDI hardware and flight software. MSSS's major subcontracted partner for development of the Mastcam lens assembly mechanisms was Alliance Spacesystems, Inc. (ASI), of Pasadena, California. ASI is now MDA U.S. Systems, LLC, a division of MacDonald, Dettwiler, and Associates Ltd.

3.2. Assembly, Test, and Launch Operations

The MARDI camera head was delivered to JPL Assembly, Test, and Launch Operations (ATLO) facilities in July 2008 in order to be operated during all Entry, Descent, and Landing (EDL) tests that began in October 2008. The Mastcam hardware was delivered to JPL in March 2010. Working in partnership with MSSS engineers, ATLO tests involving use of the Mastcams and MARDI aboard Curiosity were conducted thereafter in JPL facilities and, starting in July 2011, at NASA's Kennedy Space Center (KSC) in Florida. Major tests involving the cameras included integrated camera geometric calibration in December 2010 [Maki et al., 2012; Bell et al., 2017] and rover system thermal/vacuum testing in a large environment chamber at JPL in March 2011 [Novak et al., 2012]. Final inspection and cleaning of the camera head optics by the MSSS team occurred in August 2011 at KSC.

3.3. Instrument Operations, Data Validation, and Data Archiving

The Mastcams and MARDI are operated in parallel with the MAHLI by a full‐time professional operations staff at MSSS in San Diego, California. Mastcam, MAHLI, and MARDI science team members, working part‐time at venues across the U.S., literally from Maine to Hawaii, assist with camera operations. The operations are conducted in partnership and close consultation with the full MSL science team and rover operations engineers at JPL. Together, these personnel decide which images to obtain and when to acquire them, all on a tactical (daily) basis. Their decision‐making process is based on considerations of scientific and engineering objectives relative to resources available (e.g., power, schedule, onboard data storage, and downlink data volume) on a given Martian day (sol). Section 8 describes operations in somewhat more detail.

MSSS personnel also provide Mastcam and MARDI data and instrument status information to the MSL Project and science team on a tactical (daily) basis. Further, the operations personnel monitor the amount of data stored on board each instrument and work with the Principal Investigator and the MSL science team to make decisions on when to delete data stored therein (deletion requires Principal Investigator authorization).

Working with the Principal Investigator, MSSS personnel also ensure data validation and archiving with the NASA Planetary Data System (PDS). The validated data are placed in the NASA PDS archives on a regular basis per a schedule developed by MSL Project management. As of 5 December 2016, all validated Mastcam and MARDI data and data products received on Earth through sol 1417 (1 August 2016) were available in the archive.

4. Instrument Required Capabilities and Their Motivation

4.1. Mastcam

As proposed in July 2004, the Mastcam camera pair was envisioned to provide the primary geology and geomorphology imaging data set for the MSL mission. The required capabilities built upon operational and science lessons learned from previous landed Mars missions: Viking, Mars Pathfinder, and the Mars Exploration Rovers (MER) Spirit and Opportunity [Huck et al., 1975; Smith et al., 1997; Bell et al., 2003]. Of particular importance were the following requirements: spatial resolution and focus; true color, narrowband multispectral color, and stereo imaging and mosaics; data storage and downlink flexibility; and high‐definition video. Mosaics include any option from two overlapping images of adjacent targets to full 360° landscape panoramas. Camera pointing would be provided by the rover's Remote Sensing Mast (RSM) azimuth and elevation pointing mechanisms. In the following paragraphs, the phrase “The cameras shall” is omitted from the beginning of each subsection.

4.1.1. Spatial Resolution and Focus

Observe, in focus, geologic materials at ≤10 cm per pixel at 1 km distance and 150 μm per pixel at 2 m distance. The requirement at 1 km distance was motivated by geology, in particular to observe stratigraphic relations and rock outcrop erosional expressions at a spatial resolution higher than available from, and a viewing geometry different from, planned high‐resolution orbiter cameras (e.g., HiRISE); such images would also both assist in rover traverse planning (deciding where to go and investigate) and provide images of geology the rover would otherwise never visit (e.g., in a direction that differs from the planned traverse or on a slope too steep or high for driving).

The requirement at 2 m distance was motivated by the need to investigate and plan activities for the robotic arm workspace in front of the rover, roughly 2 m from the cameras mounted on the RSM; the 150 μm per pixel requirement permits detection of objects as small as ~500–600 μm (coarse sand) and covers the spatial resolution gap between Navigation Camera [Maki et al., 2012] and MAHLI [Edgett et al., 2012] capabilities in the workspace. Ultimately, the Mastcam spatial requirements drive a need for focusable cameras with a requirement to be in focus on objects in the robotic arm workspace, elsewhere on and near the rover, and out to infinity. Indeed, infinity focus facilitates viewing of geologic and geomorphic features beyond the 1 km, ≤10 cm per pixel range as well as viewing astronomical objects.
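These spatial resolution requirements follow directly from the small-angle relation between IFOV and range. A quick sanity check, using the Mastcam-100 IFOV of 74 μrad quoted in the abstract, confirms that the M-100 satisfies both the 1 km and the 2 m requirements:

```python
# Ground scale per pixel = IFOV (radians) * range (meters),
# valid in the small-angle approximation.

def pixel_scale(ifov_rad: float, range_m: float) -> float:
    """Scale (m/pixel) subtended by one pixel at a given range."""
    return ifov_rad * range_m

M100_IFOV = 74e-6  # Mastcam-100 IFOV, rad/pixel (from the text)

scale_1km = pixel_scale(M100_IFOV, 1000.0)  # ~0.074 m: meets <=10 cm at 1 km
scale_2m = pixel_scale(M100_IFOV, 2.0)      # ~148 um: meets 150 um at 2 m
```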

4.1.2. True Color Images and Mosaics

Obtain red, green, and blue (RGB) “true color” similar to that seen by the typical human eye in single‐frame images. The single‐image color capability minimizes image and mosaic acquisition time (and changes in illumination conditions over time) relative to cameras previously landed on Mars that employed separate red, green, and blue narrowband filters and acquired at least three (RGB) filtered images for each color scene. This requirement permits the color of geological materials to be directly relatable to human field and laboratory experience on Earth and was also designed to provide the public with a sense of what their eyes or their personal digital cameras might “see” if they were transported to Mars.

The main reason one might perform a color or white balance adjustment to an image is to accommodate illumination source “temperatures” (i.e., the light source hue as compared with the temperature of a blackbody radiator matching that hue). Consumer digital cameras, for example, commonly have built‐in capabilities that allow the user to adjust for differences in illumination conditions, such as a bright, sunny day, an overcast day, or indoor illumination by different types of artificial lamps. The solar illumination reaching the Martian surface, through its atmosphere and reflecting off its surface, and with a given opacity at a given moment in time, would differ from the terrestrial experience. One motivation for performing a color adjustment on Mastcam and MARDI images, then, is to ensure that the geologic features observed appear as they would to a human eye working with the materials if they were in a field setting on Earth.
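The white balance adjustment described above can be sketched with a simple reference-patch approach: measure the camera response on a target known to be neutral (gray) under the current illumination, and scale each channel so that the patch reads equal in R, G, and B. This is an illustrative sketch, not the actual Mastcam color calibration procedure:

```python
def white_balance(pixel, reference):
    """Scale R, G, B so that a known-neutral reference patch reads gray.

    pixel, reference: (r, g, b) tuples of linear sensor values; the
    reference is the camera's response to a neutral (gray) target under
    the current illumination (e.g., a calibration target patch).
    """
    mean = sum(reference) / 3.0
    gains = [mean / c for c in reference]
    return tuple(p * g for p, g in zip(pixel, gains))

# A reddish illuminant makes a neutral patch read (1.2, 1.0, 0.8);
# balancing restores equal channel values.
balanced = white_balance((1.2, 1.0, 0.8), (1.2, 1.0, 0.8))
```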

4.1.3. Multispectral Images and Mosaics

Provide the same 13 narrowband filters for color and near‐infrared spectroscopy (400–1100 nm) and solar observation (for atmospheric aerosol opacity) as were carefully selected and rationalized for the MER Pancam experiment [Bell et al., 2003]. This objective was enabled by the recognition that the out‐of‐band leakage of the Bayer filters was substantial (≤10%) and nearly uniform across the red and near‐infrared (IR) portion of the spectrum. Light passing through the science filters also passes through the RGB true color filters of the Bayer filter array. For the red, green, and blue monochromatic visible‐light science filters, only the matching Bayer filter passes the light, so only a fraction of the pixels record signal. For the long‐red and near‐IR science filters, the detector receives nearly equal amounts of light at every pixel. Finally, commonality with the Pancam filters was intended to facilitate comparison of results across the three rover field sites of Spirit, Opportunity, and Curiosity.
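The interaction between a narrowband science filter and the Bayer array can be illustrated by extracting one Bayer channel from a raw frame. The sketch below assumes an RGGB mosaic layout for illustration (the actual Mastcam detector pattern is described with the detector electronics):

```python
def bayer_sites(raw, channel):
    """Extract one Bayer channel from a raw frame (RGGB pattern assumed).

    Under a narrowband visible science filter, only the matching Bayer
    sites record useful signal; this pulls them out at half resolution.
    raw: 2-D list (rows of pixel values).
    """
    offsets = {"R": (0, 0), "G1": (0, 1), "G2": (1, 0), "B": (1, 1)}
    r0, c0 = offsets[channel]
    return [row[c0::2] for row in raw[r0::2]]

# Under a long-red or near-IR science filter, by contrast, all four
# Bayer sites respond nearly equally, so the full pixel grid is usable.
```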

4.1.4. Stereo Images and Mosaics

Acquire stereo imaging to provide both visual qualitative and computed photogrammetric quantitative three‐dimensional information about a scene. In the Mars rover science context, this is useful for interpreting distance, object size, and spatial relations of strata and geomorphic features. Ideally, stereoscopy is accomplished with cameras of the same focal length, as is the case for the MSL rover's engineering cameras, which are specifically designed for stereo ranging [Maki et al., 2012]. When the Mastcam was descoped in 2007 (see section 5.1 below), both stereoscopy through identical focal lengths and focus capability were removed from the requirements. A nonidentical pair of cameras was deemed capable of satisfying interests to acquire stereo pair images while at the same time preserving both high‐ and low‐resolution options. Stereo observations between cameras of dissimilar focal length are limited by the scale of the shortest focal length. The requirement to mount the two Mastcam cameras with a maximum interocular distance and, if required, toe‐in to improve the stereo overlap closer to the rover was retained.
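The value of maximizing the interocular distance follows from the standard stereo ranging error relation: for a disparity measurement error of about one pixel, the range uncertainty grows with the square of the range and shrinks with the baseline. A sketch, using the M-34 IFOV (which limits the dissimilar pair, per the text) and an illustrative 24.5 cm baseline that is an assumption here, not a value from this section:

```python
def stereo_range_uncertainty(range_m, baseline_m, ifov_rad,
                             disparity_err_px=1.0):
    """Approximate stereo ranging error: dZ ~ Z**2 * IFOV * err / B."""
    return range_m ** 2 * ifov_rad * disparity_err_px / baseline_m

# M-34 IFOV, and an assumed 0.245 m baseline: at 10 m range the range
# uncertainty for a one-pixel disparity error is roughly 9 cm.
dz_10m = stereo_range_uncertainty(10.0, 0.245, 218e-6)
```

The quadratic growth with range is why stereo ranging is most useful for the near field and the arm workspace, while far-field topography relies more on monoscopic geomorphic interpretation.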

4.1.5. Data Storage and Downlink Flexibility

Store multiple narrow angle field‐of‐view mosaics in raw form for multiple image planning cycles, in both full resolution and thumbnail versions. Eventually, it became possible to generate thumbnails in real time, so thumbnails were not stored in the camera buffers but transmitted to the rover's nonvolatile memory for storage. The Mastcam team envisioned maximizing flexibility to acquire, store, compress, and downlink images. For example, full 360° panorama mosaics could be acquired in RGB (single color image per mosaic frame, each stored as a gray scale image), small (thumbnail) versions of these images could be downlinked and mosaicked, and then the science team could choose which full‐size images (from one to all of them) to downlink, how to prioritize their downlink relative to other data on board the rover, and how to compress the data to optimize science return. This requirement, coupled with the MARDI descent imaging data storage requirements described in section 4.2, would drive decisions regarding data storage and compression options available on board the instrument.

4.1.6. High‐Definition Video

Employ video to observe dynamic events (e.g., eolian activity or astronomical events like the transit of Phobos across the face of the Sun) and provide options for public engagement opportunities. Thus, the Mastcam instruments were required to be able to acquire high‐definition video (720p; 1280 × 720 pixel format). The video requirement was, in part, a bonus provided by the commonality of design with MARDI, which required a descent video capability.

4.2. MARDI

To meet MARDI science objectives, the camera was required to be able to obtain and store a continuous series of several hundred nested RGB color images of the Martian surface of increasing spatial resolution during the descent to touchdown. It was further required to be fixed‐mounted on the rover body such that its optic axis pointed in the rover coordinate frame +Z direction (downward, the direction of the gravity vector when the rover is on a planar surface perpendicular to that vector), and with the wide dimension of the field of view perpendicular to the direction of flight (fore direction).

There were, in addition, a number of engineering objectives defined by the science team. For example, the descent imaging plan was required to begin shortly after parachute deployment but just prior to heat shield separation (in order to capture that engineering event).

Prakash et al. [2008] described the MSL Entry, Descent, and Landing (EDL) system and sequence of events; Kornfeld et al. [2014] discussed the EDL system and its verification and validation in further detail. The EDL system design was to have heat shield separation occur while the spacecraft was descending on its parachute. As the spacecraft further descended, its radar system would detect the surface and trigger parachute and backshell separation and, after a brief interval, initiate powered descent (firing engines) at an altitude between 1500 and 2000 m above the ground. Powered descent would continue until the rover was about 18.6 m above the surface and descending at a constant rate of 0.75 m/s, at which time the descent stage would release the rover on a set of cables. The system, now a descent stage and rover separated by these cables, would continue to move downward until the rover wheels touched the Martian surface. Then the cables would be cut and the descent stage would fly away and crash >150 m from the rover. MARDI imaging would be completed soon thereafter. A primary requirement on the MARDI was that imaging occur uninterrupted throughout this sequence of events.

Various aspects of the descent sequence affected the science requirements:

4.2.1. Nesting Coverage

Nesting scale impacts the ability to link geological observables in images at one scale with images at another. For descent imaging, nesting scale is a function of altitude, descent rate, and camera frame rate.

A primary requirement for the MARDI was to be able to acquire observations of the eventual landing location at several resolutions (e.g., from different altitudes), with later images “nested” inside earlier images. The descent stage's large rigid body oscillatory motions under its parachute, unknown in advance but modeled, required a large field of view. The angular rate of attitude variations required a short exposure time (<1 ms) to limit image motion smear to less than 1 pixel. The short exposure time made the detector readout time an appreciable fraction of the exposure time, leading to a large component of electronic shutter smear. The short exposure also led to faster optics than would otherwise have been ideal.
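The smear budget above is a simple ratio: smear in pixels is the angular rotation accumulated during the exposure divided by the IFOV. A sketch, using the MARDI IFOV of 0.76 mrad from the abstract:

```python
def smear_pixels(angular_rate_rad_s, exposure_s, ifov_rad):
    """Pixels of image smear caused by camera rotation during exposure."""
    return angular_rate_rad_s * exposure_s / ifov_rad

# With MARDI's 0.76 mrad IFOV, a 1 ms exposure keeps smear below one
# pixel for angular rates up to ~0.76 rad/s (~44 deg/s).
smear = smear_pixels(0.76, 1e-3, 0.76e-3)
```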

4.2.1.1. Frame Rate

A slow fixed frame rate (as was required for the Mars Polar Lander MARDI [Malin et al., 2001]) results in a nesting scale ratio that varies during descent. For geologic and geomorphic observing, a 10:1 ratio is extremely difficult to use in comparisons from one image to the next, but a ratio of 5:1 is much better. Thus, the MSL MARDI was required to obtain data over a short frame acquisition interval to ensure that the scale ratio at all points during the descent is less than 5:1. This translated to a requirement to obtain full‐frame MARDI images faster than 1 per second. The practical limit was set by the electronics clocking speed at a rate of ~5 frames per second, and the actual in‐flight frame rate was about 4 frames per second.
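Because ground scale for a nadir camera is proportional to altitude, the scale ratio between consecutive frames depends on altitude, descent rate, and frame rate, and it grows as altitude drops. A sketch with illustrative (not flight) descent numbers:

```python
def nesting_ratio(altitude_m, descent_rate_m_s, frame_rate_hz):
    """Scale ratio between consecutive nadir frames on a vertical descent.

    Ground scale is proportional to altitude, so consecutive frames at
    altitudes h and h - v/fps differ in scale by h / (h - v/fps).
    """
    dh = descent_rate_m_s / frame_rate_hz  # altitude lost between frames
    return altitude_m / (altitude_m - dh)

# Illustrative numbers: at 4 frames/s and a 75 m/s descent rate, the
# ratio is negligible at high altitude but approaches the 5:1 limit
# as altitude drops toward the altitude lost per frame.
high = nesting_ratio(2000.0, 75.0, 4.0)
low = nesting_ratio(25.0, 75.0, 4.0)
```

This is why the requirement is expressed as a frame rate floor: a sufficiently fast frame rate keeps the per-frame altitude loss small relative to the remaining altitude throughout the descent.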

4.2.1.2. Field of View and Instantaneous Field of View (Resolution)

The rapid MSL descent was envisioned to make full nesting relatively easy to achieve, but solid body oscillations and rotations during parachute descent would reduce some of the image‐to‐image overlap and could eliminate it entirely between images widely separated in time. Spatial coverage is achieved through the combined requirements for a 1648 by 1214 (1600 × 1200 active) pixel format detector inscribed in an ~90° circular field of view (FOV) lens (70 × 52° inscribed FOV), and minimal MSL hardware obstruction (except, of course, the heat shield) of the camera field of view during descent. A further, governing requirement on camera lens format was that, as the rover descends, at some point the entire landing site would be visible in an area ≥10 by 10 m at a scale of at least 1 mrad, or better than 2 cm per pixel (accomplished by vertical terminal descent).
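The field of view and IFOV requirements can be cross-checked with simple nadir-viewing geometry: swath width grows as 2h tan(FOV/2) while pixel scale shrinks as h times the IFOV. A sketch, assuming flat terrain and the 70° cross-track FOV and 0.76 mrad IFOV from the text:

```python
import math

def footprint_and_scale(altitude_m, fov_deg=70.0, ifov_rad=0.76e-3):
    """Cross-track ground swath (m) and pixel scale (m/pixel) at a
    given altitude for a nadir-pointed camera over flat terrain."""
    swath = 2.0 * altitude_m * math.tan(math.radians(fov_deg) / 2.0)
    return swath, altitude_m * ifov_rad

# Near ~26 m altitude the pixel scale reaches ~2 cm while the 70 deg
# swath spans ~36 m, covering a 10 x 10 m landing area with margin.
swath_m, scale_m = footprint_and_scale(26.3)
```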

4.2.2. Autonomous Operation

Finally, a most important requirement—a lesson learned from the previous MARDI instruments developed for the Mars Polar Lander [Malin et al., 2001] and the Phoenix Mars Scout lander—was that operation of the instrument be decoupled, as much as possible, from the mission‐critical operation of the computers controlling the spacecraft descent. Thus, the MARDI was designed to operate autonomously: after an initial turn‐on and warm‐up period, a single command would start the imaging sequence, which would run to completion during the descent and terminate after landing without further spacecraft interaction.

5. Instrument Descriptions

5.1. Introduction

This section describes the instruments as flown. These differ somewhat from the instruments originally proposed and developed, and the differences are often manifested in idiosyncratic attributes noted in the descriptions above and below. These differences resulted from the descope of both the Mastcam (initially required to remove both the zoom and the focus capability, with focus later reintroduced into the flight design) and the MARDI (initially removed from the flight payload, then reinstated, although too late to complete a full calibration and meet a delivery date to participate in all EDL testing). The descopes in September 2007 came 4 months after instrument CDR, and some of the attributes resulted from fabrication of flight hardware based on the zoom lens design that could not be refabricated for the final flight design. Among the Mastcam idiosyncrasies are the vignetting of the full field of view, the dissimilar focal lengths of the two cameras and the effects this has on stereoscopy, and the limited near‐field focus capability of the M‐100; for the MARDI, they include the relatively poor flat‐field photometry and the nonquantitative assessment of its stray and scattered light sensitivity.

5.2. Brief Overviews

Table 1 summarizes the characteristics of the Mastcam and MARDI instruments. Each of the three cameras has two hardware elements, a camera head and a digital electronics assembly (DEA). The two Mastcam camera heads are mounted on the rover's azimuth‐elevation articulated RSM [Warner et al., 2016]. The MARDI camera head is mounted, pointing downward, behind the front left rover wheel. The DEAs are packaged with the MAHLI DEA inside the rover's environment‐controlled body [Bhandari et al., 2005]. The Mastcams share a third hardware element, the Mastcam calibration target. The MARDI calibration target was affixed on the inside surface of the descent system heat shield. A strict definition of “in focus” is used for these cameras, wherein the optical blur circle is equal to or less than one pixel across.

Table 1.

Mastcam and MARDI Instrument Characteristics

Hardware Element Mass (kg) Power (W) Dimensions (cm)
Mastcam 34 mm 0.799 1.8 idle, 3.5 imaging 27 × 12 × 11
Mastcam 100 mm 0.806 1.8 idle, 3.5 imaging 27 × 12 × 11
MARDI 0.559 1.8 idle, 3.5 imaging 15 × 11 × 11
Digital electronics assembly (DEA) includes 2 Mastcams, MAHLI, and MARDI 2.089 6.4 idle, 9.0 imaging 22 × 12 × 10
Mastcam calibration target 0.079 None 8 × 8 × 6
MARDI calibration target NA None 34 × 93
Optics Description
Mastcam 34 mm Mastcam 100 mm MARDI
Field of view (FOV, diagonal) 20.6° 7.0° 90°
FOV (horizontal 1200 pixels)a 15.0° 5.1° 70°
FOV (vertical 1200 pixels) 15.0° 5.1° 52°
Instantaneous field of view (IFOV) 218 μrad 74 μrad 764 μrad
Focal ratio f/8 f/10 f/3.0
Effective focal length 34.0 mm 99.9 mm 9.7 mm
Focus range 0.5 m to infinity 1.6 m to infinity 2 m to infinity; ~1 mm resolution at surface
Band passes Figure 3 Figure 3 399–675 nm with Bayer pattern filter
Detector Description
Product Kodak KAI‐2020CM interline transfer CCD
Color Red, green, and blue microfilters; Bayer pattern
Array size 1640 × 1214 pixels, 1600 × 1200 photoactive pixels; typical image size was 1344 × 1200
Pixel size 7.4 μm (square pixels)
M‐34 gain, read noise, full well 16.0 e−/DN 18.0 e− 26,150 e−
M‐100 gain, read noise, full well 15.8 e−/DN 15.8 e− 23,308 e−
MARDI gain, read noise, full well 15.8 e−/DN 17.5 e− 29,454 e−
Exposure Description
Duration 0 to 838.8 s; commanded in units of 0.1 ms
Auto‐exposure Based on Mars rover engineering camera approach of Maki et al. [2003]
Video acquisition Description
Modes Acquire and store single‐image frames or acquire and immediately compress into groups of pictures (GOPs)
Frame rate Maximum limited by signal transmission rate between camera head and DEA to <4 Hz
Compression Frames stored individually are treated the same as individual nonvideo images; GOPs are JPEG‐compressed
Onboard data compression options Description
Uncompressed No compression; no color interpolation
Lossless Approximately 1.7:1 compression; no color interpolation
Lossy JPEG; color interpolation or gray scale; commandable color subsampling Y:CR:CB (4:4:4 or 4:2:2); commandable compression quality (1–100)
Video GOP JPEG‐compressed color‐interpolated GOPs, up to 2 Mb file size and up to 16 frames per GOP; commandable color subsampling and compression quality
Deferred compression Image stored in onboard instrument DEA uncompressed; specified compression performed at a later time for transmission to Earth
Stereo pair acquisition
The two Mastcam camera heads can be used to acquire stereo image pairs, which requires resampling to match the pixel scales between the two cameras; the cameras have an interocular distance of 24.64 cm and a toe‐in angle of 2.5°
The MARDI camera head is fixed‐mounted to the rover body; stereo pairs can only be acquired by motion of the rover—during descent and, after landing, by driving
Mosaic acquisition
Mastcam mosaic acquisition requires use of software planning tools and RSM motion to scan the camera head FOVs over the desired scene, typical overlap was 20% (top‐to‐bottom/side‐to‐side)
MARDI is fixed‐mounted to the rover body; mosaics are only acquired by motion of the rover—during descent and, after landing, by driving
Mastcam onboard focus merge Description (unavailable on MARDI)
Products Best focus image (color) and range map (gray scale)
Maximum input image size 1600 × 1200 pixels; input images must be same size
Maximum number of input images 8 images of 1600 × 1200 pixels
Input image compression Raw, uncompressed (no color interpolation)
Output image compression JPEG with user‐specified compression quality
Processing options Merge, registration (feature tracking + corner detection), blending (spline‐based)
Image thumbnails Description
Size Approximately 1/8th the size of the parent image, with each dimension rounded down to an integer multiple of 16 (thus, for a 1600 × 1200 pixel parent image, 1600/8 = 200 and 200/16 = 12.5, which rounds down to 12, giving 12 × 16 = 192; 1200/8 = 150 and 150/16 = 9.375, which rounds down to 9, giving 9 × 16 = 144; the thumbnail is therefore 192 × 144 pixels)
Purpose Provides an index of images acquired by and stored onboard instrument; returned to Earth, often in advance of parent image, for evaluation and decision‐making about on‐Mars event success, data downlink compression, and priority
Compression Commandable compressed or uncompressed; typical thumbnails from Mars are 4:4:4 color JPEGs; commandable compression quality
Video GOP One thumbnail per GOP, derived from first image in GOP

Note that the horizontal FOV is not limited to 1200 active pixels. The full 1600 active pixels are illuminated (though vignetted), so the M‐100 horizontal FOV is 16/12 × 5.1° (~6.8°) and the M‐34 horizontal FOV is 16/12 × 15° = 20°.
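The thumbnail sizing rule given in Table 1 reduces to a short computation, sketched here for illustration:

```python
# Sketch of the thumbnail sizing rule from Table 1: each dimension is
# 1/8th of the parent, rounded down to an integer multiple of 16.

def thumbnail_dims(parent_width, parent_height):
    """Return (width, height) of the thumbnail for a parent image."""
    return ((parent_width // 8) // 16 * 16,
            (parent_height // 8) // 16 * 16)

thumbnail_dims(1600, 1200)  # (192, 144)
thumbnail_dims(1344, 1200)  # (160, 144), for the typical 1344-wide image
```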

5.2.1. Mastcams

After the 2007 descope of the zoom and focus capabilities, focus was added back to both Mastcams, because their depths of field would not cover the ranges from the rover deck to the horizon (~0.5 m to several kilometers) owing to their relatively fast optics. The MAHLI focus mechanism design was adapted to the Mastcams, but did not have sufficient capability to focus the M‐100 on the rover deck (thus, there are no in‐focus images of the Mastcam calibration target using the M‐100).

Curiosity's two Mastcams are mounted about 1.97 m above ground level, on the RSM camera bar (Figure 1). Mosaic and stereo pair coverage is achieved by RSM azimuth and elevation pointing. The left camera, Mastcam‐34 (M‐34) or Mastcam‐Left (ML) (Figure 2a), is a fixed focal length, focusable, color, and multispectral camera. It has a ~34 mm focal length, f/8 lens that illuminates a 20° × 15° FOV, covering the (active) 1600 wide by 1200 high detector (although a small amount of corner vignetting occurs). The full fields of both cameras are vignetted because an earlier definition of the filter wheel and optics, optimized for the descoped zoom lens, was designed for a square FOV inscribed on a 1200 by 1200 pixel subframe without vignetting, and there was insufficient time and money to redesign the filter wheels that had already been manufactured to the original design. The IFOV of 218 μrad yields a pixel scale of 440 μm at 2 m distance and 22 cm at 1 km.
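The pixel scales quoted throughout this section follow directly from the small-angle relation, ground sample ≈ IFOV (in radians) × range. A minimal sketch, using the IFOV values given in Table 1:

```python
# Small-angle pixel-scale sketch: ground sample distance in meters is
# the IFOV (radians) multiplied by the range to the target (meters).

def pixel_scale_m(ifov_rad, range_m):
    """Ground distance subtended by one pixel at the given range."""
    return ifov_rad * range_m

pixel_scale_m(218e-6, 2.0)     # M-34 at 2 m:  ~4.4e-4 m (~440 um)
pixel_scale_m(218e-6, 1000.0)  # M-34 at 1 km: ~0.22 m  (~22 cm)
pixel_scale_m(74e-6, 1000.0)   # M-100 at 1 km: 0.074 m (7.4 cm)
```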

Figure 2.

Mastcam and MARDI flight hardware elements with pocketknife (88.9 mm long) for scale. (a) Mastcam‐34 camera head. (b) Mastcam‐100 camera head. (c) MARDI camera head. (d) Mastcam, MARDI, and MAHLI digital electronics assembly (DEA); each instrument DEA is a separate entity (they do not communicate with each other) packaged together to reduce volume and mass.

Mastcam‐100 (M‐100) (Figure 2b), also known as Mastcam‐Right (MR), is also a fixed focal length, focusable, color, and multispectral camera. It has a ~100 mm focal length (which changes very slightly with focus), f/10 lens that illuminates a 6.8° × 5.1° FOV on the (active) 1600 × 1200 pixel detector (again with slight vignetting). The IFOV of 74 μrad yields a pixel scale of 7.4 cm at 1 km distance and ~150 μm at 2 m distance. Both cameras use an RGB Bayer pattern color filter array with microlenses integrated with the charge‐coupled device (CCD) detector and can image a scene either through a broadband filter using this RGB capability or through one of the seven additional (per camera) narrowband filters, arranged on a filter wheel, in the 399–1100 nm range (Figure 3). The Bayer filter array has an RG/GB 4 pixel unit cell.

Figure 3.

Mastcam narrowband filters. (a) Photograph of white light transmitted through the Mastcam‐34 (Mastcam‐Left) filter wheel. Labels indicate the filter positions, left zero through seven (L0–L7), and the effective center wavelength as determined by Bell et al. [2017]. (b) Photograph of white light transmitted through the Mastcam‐100 (Mastcam‐Right) filter wheel. Labels indicate the filter positions, right zero through seven (R0–R7), and the effective center wavelength as determined by Bell et al. [2017]. Filter apertures are 12.6 mm in diameter. (c) Mastcam narrowband filter transmissions and CCD Bayer pattern filter (BPF) response functions. Note that the three BPF dashed lines extend longward of 700 nm wavelength, indicating leakage, and that they converge longward of 850 nm, meaning that they are equally transmissive at longer wavelengths, which permits imaging at these scientifically interesting wavelengths.

5.2.2. MARDI

MARDI (Figure 2c) is a fixed focal length, fixed‐focus, color camera with a 90° circular FOV lens that yields a 70° by 52° view on a 1600 by 1200 (active) pixel RGB Bayer pattern‐microfiltered CCD over a band pass of 399–675 nm. This wide‐angle lens shows a distinct brightness falloff from the center of the field to the edges that can be corrected using calibration flat‐field images.

The lens f/number is f/3.0, this being the slowest practical system that met a requirement to provide sufficient light gathering capability to achieve a signal‐to‐noise ratio (SNR) of greater than 50:1, even with unfavorable Sun elevation and surface albedo (both dependent on the landing site, which was unknown at the time of design). The f/3 was also the fastest practical aperture stop, because the exposure had to be set to a single fixed value for the entire descent imaging sequence, and the exposure choice was a trade‐off between the desire to minimize blurring from solid body motion during descent and the desire to improve SNR and reduce interline transfer smear (the latter becoming more of an issue with shorter exposure times). The exposure duration also had to provide margin against sensor saturation from high‐albedo, low‐phase angle surfaces in the scene. After extensive deliberation and testing under Earth sunlit conditions using the MARDI engineering model hardware, we settled on an exposure time of 0.9 ms for the descent. Based on the descent design [Prakash et al., 2008], this was anticipated to result in acquisition of 1000–1500 frames, well within the capacity of the MARDI onboard data storage, which was designed to hold about 4000 raw full‐frame 8 bits per pixel images.
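The motion-blur side of this exposure trade can be sketched with a one-line relation: smear in pixels is the angular rate times the exposure time, divided by the IFOV. The angular rate used below is an assumed illustrative value, not a measured flight rate:

```python
# Hedged sketch of the motion-smear part of the exposure trade:
# smear (pixels) = angular rate (rad/s) * exposure (s) / IFOV (rad).
# The 0.5 rad/s oscillation rate below is an assumption for illustration.

def smear_pixels(angular_rate_rad_s, exposure_s, ifov_rad):
    """Image smear, in pixels, from solid-body angular motion."""
    return angular_rate_rad_s * exposure_s / ifov_rad

# 0.9 ms exposure and a ~0.76 mrad MARDI IFOV:
smear_pixels(0.5, 0.9e-3, 0.76e-3)  # ~0.6 pixel, under the <1 pixel goal
```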

The MARDI IFOV is 0.768 mrad, which provides in‐focus pixel scales that range from ~1.5 m at 2 km range to ~1.5 mm at 2 m range, covering surface areas between 2.4 by 1.8 km and 2.4 by 1.8 m, respectively. At distances <2 m, out‐of‐focus blurring increases proportionally as the spatial scale decreases, resulting in a constant spatial sampling of ~1 mm. Once landed, and at a range to the surface of 66 cm (to front of sun shade; the focal plane is about 77 cm above the surface), MARDI images have a pixel scale of ~0.5 mm per pixel over a ~80 by ~60 cm surface, and the out‐of‐focus resolution is about three pixels (1.5 mm). Interpolative resampling reduction of MARDI images by a factor of 2 provides a good approximation to the resolution of the system.

MARDI was provided with an 11.8 W survival heater to maintain its allowable flight temperature (−40 to +40°C) during the mission's cruise phase, if needed. This heater was also used prior to EDL to raise MARDI's temperature to ensure that it would remain within its flight allowable range during the cooling predicted to occur throughout EDL. Since MARDI had no requirement to operate during the landed mission, the heater is not sized for use on the surface. Owing to its descope history, MARDI did not go through the same thermal environmental qualification testing as the other cameras (that nominally includes testing in the expected surface thermal environment), which raised questions about its survivability without heat on the surface. However, since landing, although the MARDI camera head reaches temperatures well below its tested allowable flight temperature during diurnal cycling, both its electronics and optics were built to the same design rules as Mastcam and MAHLI, and it remains fully operational more than 2.3 Mars years after landing.

5.3. Camera Heads

Each Mastcam and MARDI camera head (Figures 2a–2c) consists of a lens assembly, a focal plane assembly, and a camera head electronics assembly. The latter two share a common design with each other and the MAHLI. The Mastcam lenses are focusable and each Mastcam camera head also includes an eight‐position, rotatable filter wheel.

5.3.1. Focal Plane Assembly and Electronics

Each camera's focal plane assembly is designed around a Kodak (now ON Semiconductor) KAI‐2020CM charge‐coupled device (CCD) without a cover glass (Figure 4, left). The sensor has 1640 by 1214 pixels of 7.4 μm by 7.4 μm size. The photoactive area is a 1600 by 1200 pixel area inside of 16 dark columns and four photoactive buffer pixel columns on either side of the detector, two dark rows at the top and four dark rows at the bottom, plus four photoactive buffer pixel rows at both the top and bottom of the detector. The CCD uses interline transfer to implement electronic shuttering. The sensor has red, green, and blue (RGB) filtered microlenses arranged in an RG/GB Bayer pattern. Microlenses improve detector quantum efficiency, which is about 40% on average for the three color channels. The “fast‐dump” capability of the sensor is used to clear residual charge prior to integration and also allows vertical subframing of the final image.

Figure 4.

Mastcam and MARDI electronics block diagram. (top left) The elements common to the Mastcam camera heads but not available in MARDI are shaded dark gray. MC is Mastcam, FPGA is a field‐programmable gate array, LVDS is low‐voltage differential signal, an ADC is an analog to digital converter, the Kodak KAI‐2020CM is a charge‐coupled device (CCD), NAND is a type of nonvolatile (flash) memory, EPROM is an erasable programmable read‐only memory, Gb is gigabyte, Mb is megabyte, DC‐DC converters change a source of direct current from one voltage level to another, SDRAM is synchronous dynamic random access memory, I/F stands for interface, V is voltage, and S/C represents spacecraft, which in this case is the rover.

The output signal from the CCD is AC‐coupled and then amplified. The amplified signal is digitized to 12 bits at a maximum rate of 10 megapixels per second. For each pixel, both reset and video levels are digitized and then subtracted in the digital domain to perform correlated double sampling (CDS), resulting in a typical 11 bits of dynamic range and 1 bit of noise.
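Digital correlated double sampling, as described above, amounts to a per-pixel subtraction of the two digitized samples. A minimal conceptual sketch (the sample values are invented for illustration):

```python
# Conceptual sketch of digital correlated double sampling (CDS): both
# the reset level and the video level are digitized for each pixel, and
# their difference cancels the reset offset common to the two samples.

def cds(reset_counts, video_counts):
    """Return CDS pixel values (video minus reset) for paired samples."""
    return [v - r for r, v in zip(reset_counts, video_counts)]

cds([2048, 2050, 2047], [2300, 2900, 2500])  # [252, 850, 453]
```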

The camera head electronics are laid out as a single rigid‐flex printed circuit board (PCB) with three rigid sections. The sections are sandwiched between housings that provide mechanical support and radiation shielding; the interconnecting flexible cables are enclosed in metal covers for radio‐frequency shielding. Camera head functions are supervised by a single Actel RTSX field‐programmable gate array (FPGA). In response to commands from the DEA (Figures 2d and 4, right), the FPGA generates the CCD clocks, reads samples from the analog‐to‐digital converter (ADC) and performs digital CDS, and transmits the pixels to the DEA. The FPGA is also responsible for operating the motors that drive the Mastcam focus and filter wheel rotation mechanisms.

The camera heads operate using regulated 5 V and ±15 V power provided by the DEA. A platinum resistance thermometer (PRT) on the camera focal plane provides temperature knowledge for radiometric calibration. An etched‐foil heater and an additional pair of PRTs are attached to the outside of the camera head and thermostatically controlled by the rover to warm the mechanism for operation when needed. On Mars, the camera head is usually operated at temperatures between −40°C and +40°C and was verified (through testing on a nonflight unit) to be able to survive nearly 3 Mars years of diurnal temperature cycles (down to −130°C) without any heating.

5.3.2. MARDI Lens Assembly

The fixed‐focus MARDI lens has an effective focal length of 9.68 mm, a 90° circular FOV, and its front element has a diameter of 40.0 mm. It has a focal ratio of 3.0 and a distortion of −18.6%. The 74.4 mm long aluminum lens housing contains seven powered elements and a plane‐parallel optical filter (Figure 5c). The filter limits the spectral band pass to 399–675 nm. Ghaemi [2009, 2011] provided details regarding the lens design and fabrication challenges. The MARDI 90° circular FOV, projected onto the 1600 by 1200 CCD, provides an overall 70° by 52° view.

Figure 5.

(a and b) Mastcam and (c) MARDI optics and ray‐trace diagrams (not to scale).

5.3.3. Mastcam Optomechanical Lens Assembly

The Mastcam fixed focal length optomechanical lens assemblies include integrated optics and focus mechanisms and a single drive motor to adjust focus (Figure 6). DiBiase and Laramee [2009] described aspects of the optomechanical design and manufacture; Ghaemi [2009, 2011] described the optics and their assembly.

Figure 6.

Representative cross‐sectional views of a Mastcam camera head. Shown here is the Mastcam‐100. The top view shows the entire camera head including its sunshade baffle; the lower view shows the details of the optical and mechanical lens and filter wheel assemblies.

The optics for each Mastcam is an all‐refractive focusable design: Mastcam‐34 has six fixed elements and three in a single movable group; Mastcam‐100 has five fixed elements and three in a single movable group; and the front of each also carries a protective plano‐plano sapphire window (Figures 5a and 5b). Attached to the front of each camera is an aluminum sunshade that comprises nearly two thirds of the length of the camera. It provides stray light rejection, protection against dust accumulation on the front window, and structure for mounting the front of the camera to the RSM (the size and shape were dictated by the design of the descoped zoom lens).

The 34 mm lens has a focal ratio of about f/8 and a field of view of about 20.6°. Its angular resolution is about 218 μrad and it can focus on targets at distances of about 0.37 m to infinity. The focal length varies from 34.2 mm at infinity to 34.0 mm at closest focus. The 100 mm lens has a focal ratio of about f/10 and a field of view of about 7.0°. Its angular resolution is about 74 μrad and it can focus on targets at distances of about 1.8 m to infinity. The focal length varies from 99.9 mm at infinity to 94.8 mm at closest focus.

Mechanically, the Mastcams are similar to MAHLI, but with the omission of the MAHLI dust cover mechanism. The lens focus group is positioned using an Aeroflex 10 mm stepper motor with a 256:1 gearbox. The motor drives the integral gear of a cam tube, and the focus group slides on linear bearings under the control of a cam follower with rolling bearing. End‐of‐travel sensing is performed using a Hall‐effect sensor and magnet pairs on the cam. The total useful range of travel of the focus group is about 2 mm. To protect against damage during the vibratory loads of launch, EDL, and driving, the mechanism has a passive launch restraint at one end of its range of travel that loosely captures the cam follower.

The mechanisms are lubricated with polytetrafluoroethylene (PTFE)‐based vacuum‐compatible grease. This grease suffers from viscosity increase at low temperatures, requiring heating to be applied at some times of day for reliable actuation. This is accomplished using an etched‐foil heater wrapped around the lens barrel, thermostatically controlled by the rover. The minimum allowable flight temperature of the mechanism is −40°C, although function has been demonstrated at considerably colder temperatures.

5.3.4. Mastcam Filter Wheel Assemblies

Each Mastcam has an eight‐position filter wheel actuated by an Aeroflex 10 mm stepper motor with a 16:1 gearbox. DiBiase et al. [2012] described the mechanical drive system. The motor drives an integral gear on the outer diameter of the filter wheel; the filter wheel is supported by a bearing on its inner diameter. Position sensing of the wheel is accomplished using a Hall‐effect sensor and magnet pairs at each filter location, with an extra magnet pair for absolute indexing. Each filter has an aperture of 12.6 mm diameter.

Figure 3 shows the arrangement of filters for the Mastcam‐34 and Mastcam‐100 camera heads. In each, one of the eight positions contains a “clear” broadband infrared cutoff filter that is used with the Bayer RGB color capability of the CCD; this is filter position 0. At position 7, each camera has a 10^−5 neutral density filter for solar observations (Mastcam‐34, ~880 nm; Mastcam‐100, ~440 nm). Figure 3 also shows the other visible and near‐infrared narrowband filters and their response functions; the spectral bandwidths are from system‐level monochromator measurements described by Bell et al. [2017]. Note that filter position 0, the infrared cutoff filter, is designed such that the cameras provide “natural color” in a single exposure.

Because the Bayer pattern filters are effectively transparent over the ~800 to ~1050 nm range, the IR filters can be used without regard to the pattern. For those filters that overlap the Bayer pattern wavelengths, the Bayer color interpolation is adjusted to either scale or ignore the pixel positions outside the filter's band pass. Thus, the images can appear to be gray scale or the principal color of the filter.

5.4. Digital Electronics Assembly (DEA)

5.4.1. DEA Hardware

The Mastcam and MARDI DEAs (Figures 2d and 4, right) are packaged with the MAHLI DEA to save mass. In each DEA, the electronics are laid out on a single rectangular PCB. The DEA interfaces the camera head with the rover avionics. All data interfaces are synchronous (dedicated clock and sync signals) and use low‐voltage differential signaling. Each (redundant) rover interface comprises two flow‐controlled serial links, one running at 2 megabits per second from the rover to the DEA and another at 8 megabits per second from the DEA to the rover. The DEA transmits commands to its respective camera head using a 2 megabits per second serial link and receives image data from the camera head on a 30/60/120 megabits per second selectable rate 6‐bit parallel link. The DEA is powered from the rover's 28 V power bus and provides switched regulated power to the camera head. It also contains a PRT for temperature monitoring.

The core functionality of the DEA is implemented in a Xilinx Virtex‐II FPGA. All interface, compression, and timing functions are implemented as logic peripherals of a Microblaze soft‐processor core in the FPGA. The DEA provides an image‐processing pipeline that includes 12‐to‐8‐bit companding of input pixels, horizontal subframing, and optionally lossless predictive or lossy Joint Photographic Experts Group (JPEG) compression. The latter also requires the Bayer pattern raw image to be interpolated and reordered into luminance/chrominance block format. The onboard image‐processing pipeline can run at the full speed of camera head input, writing the processed data stream directly into DEA memory.
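The 12-to-8 bit companding step in the pipeline maps the sensor's 12-bit dynamic range into 8-bit downlink values. The specific flight lookup table is not reproduced here; a common approach, assumed purely for illustration, is square-root encoding, which sizes quantization steps to track photon shot noise:

```python
# Hedged sketch of 12-to-8 bit companding.  The actual flight lookup
# table is not given in the text; square-root encoding is an assumed,
# illustrative scheme that allocates finer steps to darker signal levels.

def compand_sqrt(dn12):
    """Map a 12-bit value (0-4095) to 8 bits via square-root companding."""
    return min(255, round((dn12 ** 0.5) * 255 / (4095 ** 0.5)))

compand_sqrt(0), compand_sqrt(4095)  # (0, 255)
```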

The DEA memory subsystem contains 128 Mb of SDRAM (synchronous dynamic random‐access memory) and 8 Gb of nonvolatile NAND flash memory. The flash is organized as a large image buffer, allowing images to be acquired without use of rover memory resources at the maximum camera data rate. The SDRAM is typically used as scratchpad space and to store file system information, but can also be used as a small image buffer.

5.4.2. Flight Software

DEA hardware functions are coordinated by the DEA flight software (FSW) that runs on the Microblaze. The FSW receives and executes commands generated on Earth and relayed by the rover, and transmits any resulting data. The FSW also implements autofocus and auto‐exposure algorithms for image acquisition, performs error correction on the contents of flash memory, implements mechanism control and fault protection, and performs focus stack merges. The FSW consists of about 10,000 lines of ANSI C code (American National Standards Institute C programming language).

The FSW allows considerable flexibility for image acquisition. Images can be acquired in uncompressed form into the flash buffer and read out multiple times using different compression schemes. Alternatively, images can be compressed during acquisition to minimize the amount of buffer space they occupy; the latter capability is useful for video sequences. Reduced resolution “thumbnail” versions of each image can be generated on command. These thumbnails are generated in the FSW with some assistance from the DEA logic; in the case of images that are stored in compressed form, partial software decompression is required to produce the thumbnail. Thumbnails are produced “on‐the‐fly” just prior to their delivery to the rover compute element and queued for downlink; they are not stored by the cameras.

5.4.3. Postlanding Flight Software Modifications

Since arrival at Mars, very few issues have arisen regarding the Mastcam and MARDI flight software, and all that did arise have been managed with operational workarounds. Therefore, the Mastcams are still running on Mars using the same version of the software with which they were launched. MARDI, however, is using a new version of the FSW that adds a requested capability for so‐called “sidewalk mode” imaging. This operational mode, installed on sols 628–629 (13–14 May 2014), yields mosaics of MARDI images that parallel a rover drive. When driving over a layered rock exposure, these data can be used to continuously map stratigraphic relations at subcentimeter scales.

Normally, when the rover is driving it moves a short distance and then stops to evaluate its position. MARDI videos acquired during such drives would contain many redundant frames acquired during the pauses, but there is no way to command the video to start and stop based on rover behavior. For sidewalk mode, images are acquired, stored in MARDI SDRAM, and each is compared to the previous image using a simple mean‐squared error metric. If the error exceeds a fixed value, the image is considered to contain new scene information and is stored in the DEA flash storage. This method discards images acquired during drive pauses. The additional computation limits the maximum frame rate to about 1 frame every 3 s, but this usually provides good overlap between adjacent frames. Generally, the image thumbnails (section 7.9) are downlinked first and science investigators select which full frames to downlink.
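The frame-selection logic described above can be sketched in a few lines. The threshold and pixel values below are illustrative, not the flight parameters:

```python
# Sketch of the sidewalk-mode selection logic: a frame is kept only if
# its mean-squared error against the last kept frame exceeds a fixed
# threshold, discarding near-duplicate frames from drive pauses.
# Frames are flattened pixel lists; values are illustrative.

def mse(a, b):
    """Mean-squared error between two equal-length pixel sequences."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def select_frames(frames, threshold):
    kept = [frames[0]]                # always keep the first frame
    for frame in frames[1:]:
        if mse(frame, kept[-1]) > threshold:
            kept.append(frame)        # new scene content: store it
    return kept

frames = [[10, 10, 10], [10, 11, 10], [50, 60, 70]]
select_frames(frames, 5.0)  # drops the near-duplicate middle frame
```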

5.5. Calibration Targets

5.5.1. MARDI

Serving as a white balance target during the MSL rover's descent to the Martian surface, the MARDI calibration target (Figure 7b) was a swatch of beta cloth (PTFE‐coated fiberglass cloth; Bron Aerotech BA 500BC). The target is a truncated 60° pie wedge‐shaped piece with a base of 34 cm and a side of 93 cm. Prelaunch calibration included imaging of a sample of target material with a standard color reference under close to identical illumination conditions.

Figure 7.

MARDI hardware elements during final spacecraft assembly in October 2011 at the NASA Kennedy Space Center Payload Hazardous Servicing Facility in Florida, USA. (a) Illustrating the unobstructed view that MARDI would have for most of the descent, this photograph shows the camera head after the rover was packaged with the MSL descent stage inside the spacecraft backshell. The inset shows a close‐up view of the camera head. NASA photograph KSC‐2011‐7344. (b) MARDI calibration target mounted inside the heat shield. NASA photograph KSC‐2011‐7307.

5.5.2. Mastcam

The pair of Mastcams share an 8 by 8 by 6 cm calibration target that provides a suite of reference color and gray swatches of known spectral reflectance and photometric properties (Figure 8). The target provides opportunities for validation of the instrument's radiometric calibration and tactical calibration of radiance images to relative reflectance or Lambert albedo in the Martian environment. The shadow cast by a central post (gnomon) accommodates assessment of the direct versus diffuse components of solar and sky irradiance incident on the target.

Figure 8.

The Mastcam calibration target on board Curiosity as viewed from three different cameras. The target is a flight spare of the MER Pancam calibration target, described by Bell et al. [2003] to which ring‐shaped “sweep” magnets were added to provide dust‐free areas on the color and gray swatches [see Bell et al., 2017]. (a) Details on the target, including initial sand and dust clinging to the embedded ring magnets, are evident in this sol 14 Mastcam‐34 view (from image 014ML0000210000100207E01). (b) The calibration target as viewed from a different camera position by the MSL MAHLI on sol 544 (portion of image 0544MH0003510000201493C00). (c) For comparison with the scale and focus of the view to the right in Figure 8d, this is the Mastcam‐34 view of the calibration target on sol 14 (image 014ML0000210000100207E01). (d) This is the Mastcam‐100 view on the same sol (image 0014MR0000210020100071E01). The target is too close to the Mastcam‐100 camera head for the camera to focus on it; nevertheless, sufficient pixels cover the red and gray swatches so as to be useful for in‐flight calibration [Bell et al., 2017].

The Mastcam calibration target is a modified flight spare from the MER Pancam investigation. The two preceding flight unit calibration targets were aboard the rovers Spirit and Opportunity [Bell et al., 2003; Bell et al., 2006]. The flight spare was modified to add six ring‐shaped “sweep” magnets [Madsen et al., 2003] approximately 1 mm under the surface of the color swatches and two of the gray scale rings to enable small parts of the target to remain relatively dust‐free throughout the surface mission; a similar strategy was employed for the Phoenix Mars lander Surface Stereo Imager [Leer et al., 2008]. Details of the reflectance and photometric properties of the target materials were described previously by Bell et al. [2003] and Bell et al. [2006]. Operationally, calibration target observations through all filters are acquired either before or after all multispectral and photometric sequences, and on a few other occasions, amounting to roughly 5% of the Mastcam sequences transmitted to the rover.

6. Instrument Accommodation

6.1. MARDI Camera Head

The MARDI camera head is mounted rigidly on the bottom front left (fore port) side of the Curiosity rover with the top of the sunshade aligned evenly with the bottom of the rover chassis (Figure 7a). Its optic axis points along the rover coordinate frame +Z axis (toward the ground), and the long dimension of the detector is oriented perpendicular to the direction of flight (in practice, the FOV of the camera is inverted relative to the normal video scan direction). This position was designed to provide a view of the Martian surface, during descent, that is unobstructed by hardware except the descending heat shield and a portion of the front left rover wheel that would drop into view during the rover's cable descent from the hovering “sky crane” descent stage [Prakash et al., 2008]. The shutter speed was selected to address the effects of parachute‐induced solid body rotation rates (large magnitude but moderately low frequency) and descent‐thruster‐induced vibration (small magnitude but higher frequency).

6.2. Mastcam Camera Heads and Pointing

The Mastcam camera heads are mounted on the “camera plate” at the top of Curiosity's RSM, along with the ChemCam mast unit [Maurice et al., 2012] and the rover's four (two primary and two spare) Navigation cameras [Maki et al., 2012]. Each camera head is bolted to the RSM camera plate via two mounting feet on the upper housing/filter wheel assembly, and at a “bipod” support near the front of the camera's sunshade. Viewed from the back, the 34 mm camera is mounted on the left and the 100 mm camera is mounted on the right, and the terms “left eye” and “right eye” or “Mastcam left” and “Mastcam right” or “ML” and “MR” are commonly used in the course of Mastcam operations. The relative position and orientation of the cameras as measured during calibration is included in the camera models in the archival (NASA PDS) data product labels.

The RSM provides pointing in azimuth and elevation relative to the rover coordinate system. The RSM can point the Mastcams +91° (up) to −87° (down) in elevation; azimuth pointing permits full 360° views. Because of the orientation of the Mastcams, the pointing directly corresponds to the raw image coordinate system, such that azimuth moves a target left and right in image space and elevation moves it up and down. Mastcam imposed a pointing error requirement of 1/10 of the Mastcam‐100 FOV or about 0.5°; in practice, the control is better and pointing stability during exposures has not been an issue. Operationally, Mastcam pointing is commanded in a variety of coordinate systems, both rover‐fixed and relative to the surface. Pointing can be specified either angularly or as rectangular coordinates, whichever is more appropriate to the imaging use.

An idiosyncrasy of the Curiosity deck layout and the placement and orientation of the RSM coordinate system is that the Mastcam calibration target is positioned on the rover such that the RSM hard stop is between the optimal pointing angle for the left and right Mastcams. Thus, to view the target with both cameras, the RSM has to travel almost a full 360° between images, a minor operational inefficiency.

A flex cable connects each Mastcam and its DEA. These provide power to each camera head and form the communications link between them. These cables pass through azimuth and elevation twist capsules, which account for the relatively long total length of the flex cable. This design simplifies cable management, and is why the RSM cannot point below −87° elevation.

6.3. DEAs and Camera Head to DEA Harness

The DEA package—containing DEAs for each of the four cameras: Mastcam‐34, Mastcam‐100, MARDI, and MAHLI—is located inside the rover's environmentally controlled body (the warm electronics box). Each Mastcam camera head is connected to its respective DEA by a multisegment flex and round‐wire cable, approximately 6 m in total length, that is integrated with the rover's RSM. MARDI uses a conventional round‐wire cable harness, also provided by JPL. Unlike MAHLI, which is limited in its maximum data rate by the longer harness on the rover's robotic arm [Edgett et al., 2012], neither Mastcam nor MARDI has such a limitation; both can run at the full signaling rate of 20 MHz.

6.4. Calibration Targets

The MARDI white balance target was mounted on the interior surface of the MSL heat shield (Figure 7b) so that it would be approximately centered in the MARDI FOV at heat shield release.

The Mastcam calibration target was required to be visible to, and no closer than 1 m from, the Mastcams. In addition, the science team desired the calibration target to be located such that it would typically be fully illuminated by the Sun (e.g., no shadows from hardware) between at least 10:00 and 14:00 local solar time on Mars. There was also a desire to mount the target at a location on the rover at which it would be viewed at an emission angle similar to or the same as the Pancams' views of their calibration targets on the MER rovers (camera pointing elevation −36.5° ± 2.5° relative to a horizontal elevation of 0°), so as to take advantage of the detailed characterization work [Bell et al., 2003; Bell et al., 2006] already performed.

Ultimately, the Mastcam calibration target was mounted on the top of the Rover Pyro Firing Assembly box at the right rear of the rover deck (Figure 8). As described by Bell et al. [2017], pointing the Mastcam‐34 at the calibration target means positioning it at rover‐frame azimuth 189.2°, elevation −32.4°; for the Mastcam‐100, the elevation is the same and the azimuth is 176.2°. The distance between the center of the base of the calibration target gnomon to each Mastcam sapphire window is about 1.2 m; as Bell et al. [2017] discuss, this is too close for acquisition of in‐focus Mastcam‐100 images, but the greater number of Mastcam‐100 pixels (relative to Mastcam‐34 pixels) covering the target calibration surfaces compensates for the blurred view and does not compromise use of the target as a relative calibration source for Mastcam‐100 data [Bell et al., 2017].

7. Instrument Capabilities and Onboard Data Processing

7.1. Full‐Frame and Subframe Imaging

Mastcam and MARDI image size—the number of rows and columns of pixels—is commanded in whole number multiples of 16. Full‐frame images are 1600 by 1200 pixels in size or larger (the detector is 1640 by 1214 pixels in size; because of the multiples of 16, the maximum number of columns is 1648 pixels). Subframes, when desired, are commanded at the time of image acquisition. Mastcam full‐frame images are vignetted; to avoid the corner darkening, subframes, often operationally called ”full frames,” are typically 1344 × 1200 pixels, as these permit fewer images in a mosaic. An alternative Mastcam mosaic format, 1200 × 1200 pixels, was used early in the mission. Mastcam solar tau measurements are typically subframed to 256 × 256 pixels, and some engineering images of the drill bit and sample portioner tube are also occasionally subframed to 256 × 256 pixels.

7.2. MARDI Descent Operation

Operation of MARDI during EDL was required to be decoupled from, and occur independently of, the rover. Normally, the MSL science instrument command process requires a “handshake” from the instrument to the rover. To eliminate any chance of a MARDI problem disrupting EDL events, MARDI was operated in a special “quiet” mode that suppressed the handshake. The rover simply commanded MARDI to begin imaging, and the rest occurred internally to MARDI. Each command was sent twice. The EDL image sequence was commanded as 1504 frames at the maximum frame rate, resulting in a video duration longer than any possible descent time. This command was designed to complete normally instead of being terminated at landing; thus, some number of images were intended to be acquired of the surface beneath the camera after touchdown. Because uplink commanding of the science instruments was not intended to begin until almost a week after landing, a downlink plan was devised in advance to sample the full sequence. Thus, many images were returned out of order, sampled from across the entire descent data set, and not necessarily best frames first, because the plan was based on an a priori worst‐case landing timeline. The actual landing took less than half the time of the worst case.

7.3. Mastcam Focus

7.3.1. Manual and Previous Focus

Each Mastcam can be focused “manually” or it can be autofocused. A manual focus is one in which the lens focus group is commanded to move to a specified motor count position, after which an image is acquired. The cameras can also be commanded to use the same focus setting as the immediately previous image (i.e., a command to not change focus before acquiring the next image).

Bell et al. [2017] found that the relationship between focus group stepper motor count and distance to a target is

Mastcam‐34: d = 363.64 / (2427.5 − f34) (1)
Mastcam‐100: d = 3322.3 / (3491.9 − 2.58 T − f100) (2)

in which d is the distance from the camera to the focused target in meters, T is the temperature of the camera in degrees Celsius, and f34 and f100 are the focus motor counts reported by the Mastcam‐34 and Mastcam‐100 cameras, respectively.
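Equations (1) and (2) can be applied directly; the following is an illustrative sketch in Python (the function names are ours, not flight software):

```python
def mastcam34_distance_m(f34):
    """Target distance (m) from the Mastcam-34 focus motor count, equation (1)."""
    return 363.64 / (2427.5 - f34)

def mastcam100_distance_m(f100, temp_c):
    """Target distance (m) from the Mastcam-100 focus motor count and
    camera temperature in degrees Celsius, equation (2)."""
    return 3322.3 / (3491.9 - 2.58 * temp_c - f100)
```

For example, a Mastcam‐34 motor count of 2063.86 corresponds to a target at 1 m.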

7.3.2. Autofocus

The Mastcam autofocus command instructs the camera to move the lens focus group to a commanded starting position estimated to be closer to the camera than the actual best‐focus position (i.e., with the far field out of focus) and collect an image, move a specified number of motor counts from that position and collect another image, and continue doing so until a commanded total number of images has been acquired, each separated by the specified motor count increment. Each of these images is JPEG compressed (Joint Photographic Experts Group; see CCITT [1993]) with the same compression quality factor. The file size of each compressed image is a measure of scene detail, which is in turn a function of focus (an in‐focus image shows more detail than a blurry, out‐of‐focus view of the same scene). In practice, assuming the seed position was on the near side of focus and the final position on the far side, the file sizes start low, increase to a maximum, and then decrease. As illustrated by Edgett et al. [2012], the camera examines the relationship between JPEG file size and motor count and fits a parabola to the three samples around the maximum file size. The vertex of the parabola gives the best‐focus motor count position. Having made this determination, the focus stepper motor moves the lens focus group to that position and a final image is acquired. Only this last image is stored; the earlier images used to determine the autofocus position are not.
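The parabola fit at the heart of this procedure can be sketched as follows. This is an illustrative reimplementation, not the flight code; it assumes evenly spaced motor counts and uses the closed‐form vertex of a parabola through three equally spaced samples:

```python
def best_focus_count(counts, jpeg_sizes):
    """Estimate the best-focus motor count from a focus sweep.

    counts      : evenly spaced focus motor count positions
    jpeg_sizes  : JPEG file size of the image taken at each count
                  (larger file size = more detail = closer to focus)
    """
    # Index of the largest file size, clamped so three neighbors exist.
    i = max(range(len(jpeg_sizes)), key=jpeg_sizes.__getitem__)
    i = min(max(i, 1), len(jpeg_sizes) - 2)
    y1, y2, y3 = jpeg_sizes[i - 1], jpeg_sizes[i], jpeg_sizes[i + 1]
    h = counts[1] - counts[0]  # motor count increment between images
    # Vertex of the parabola through the three samples around the maximum.
    return counts[i] + h * (y1 - y3) / (2.0 * (y1 - 2 * y2 + y3))
```

With file sizes that rise and fall symmetrically about motor count 22, for instance, the sweep [0, 10, 20, 30, 40] yields a vertex of 22 even though no image was taken there.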

Autofocus can be performed over the entire Mastcam field of view, or over a subframe corresponding to the portion of the scene that includes the object(s) intended to be in focus. The technique is not perfect, as factors in the scene itself can spoof the approach (e.g., half of the scene is very active while half is not, or the scene is uniformly active but spans a distance greater than the depth of field of the lens). Success is also highly dependent on the seed estimate and on the number and size of the increments (e.g., if the seed is too close and the commanded steps do not carry the sweep through best focus, or, conversely, if the seed is already beyond focus and the sweep never passes through it).

7.4. Mastcam Filter Wheel

Each Mastcam filter wheel (Figures 3a and 3b) position can be commanded either with an explicit, separate command to rotate to a specific position or implicitly by specifying a filter position when commanding image acquisition. The filter wheel gear ratio is such that it takes 2346.67 stepper motor counts to rotate the wheel by 360°, or 293.33 motor counts to rotate from one filter position to the next. Since it is impossible to move by fractional motor counts, this is rounded up to 294 counts. For a single revolution, this produces a negligible positioning error, but if left to accumulate, the mismatch would eventually lead to noticeable misalignment of the filters. To avoid this, the wheel is only commanded over a total angle of 315°; that is, if the wheel is at position 7 and position 0 is desired, the wheel is driven seven positions backward instead of one position forward, even though the latter would be faster. The motor is typically driven at a rate of 150 motor counts per second, so each filter step takes slightly less than 2 s. The magnet pair at each filter position could be used for positioning drive control; in practice, however, filter wheel operation has always been skip‐free, so positioning is done by dead reckoning. A homing algorithm implemented in the flight software can move the wheel to the filter 0 location without any knowledge of its initial position. Periodic filter homing operations can be performed if there is reason to suspect filter wheel slippage.
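The motor count arithmetic above can be sketched as follows (a minimal illustration with our own function names, not flight code). Treating the position 7 to position 0 boundary as a barrier that is never crossed is what keeps the 294 versus 293.33 count rounding error from accumulating over repeated revolutions:

```python
import math

COUNTS_PER_REV = 2346.67                             # stepper counts per 360 deg
COUNTS_PER_POSITION = math.ceil(COUNTS_PER_REV / 8)  # 293.33, rounded up to 294

def filter_move_counts(current, target):
    """Signed motor counts to move between filter positions 0-7.

    Moves are always linear within positions 0..7 and never cross the
    7 -> 0 boundary, so going from position 7 to position 0 means driving
    seven positions backward rather than one position forward.
    """
    return (target - current) * COUNTS_PER_POSITION

def move_duration_s(counts, rate=150):
    """Approximate move duration at the typical 150 counts/s drive rate."""
    return abs(counts) / rate
```

One step (294 counts) thus takes 1.96 s, consistent with the "slightly less than 2 s" figure above, while the worst case, seven positions, takes about 13.7 s.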

7.5. Exposure

The exposure duration for each Mastcam or MARDI image can be commanded manually or by using the instrument's auto‐exposure capability. The exposure duration can range from 0 to 838.8 s (i.e., up to 13.98 min). Exposure times are commanded in units of 0.1 ms and manual exposures require the user to know something about the expected illumination conditions.

The auto‐exposure algorithm is derived from the implementation of Maki et al. [2003] for the MER engineering cameras. To command an auto‐exposure, users specify a target maximum digital number (DN; the pixel value), the percentage of pixels permitted to exceed that DN, the number of iterations to be performed to close in on the auto‐exposure value, and an early termination percentage that specifies the matching threshold between the DN desired and the DN measured; if the two values are within the termination percentage of each other, the algorithm terminates early, and that image is accepted for downlink. The user also commands the starting exposure duration. The auto‐exposure algorithm causes the camera to acquire up to the specified number of images, starting at the commanded exposure time and adjusting the time up or down so that no more than the specified percentage of pixels exceeds the target maximum DN. The auto‐exposure process is best used under well‐illuminated conditions; auto‐exposure terminates when the exposure time exceeds 10 s in order to limit the period during which the camera cannot take action to abort its activity if necessary. The early termination parameter saves time because it specifies the point at which the automated process has come close enough to the exposure goal.
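One iteration of this adjustment, and the early‐termination test, can be sketched as below. This is illustrative Python with our own names; the flight algorithm includes details not modeled here (saturation handling, the pixel‐percentage statistic, and the 10 s cap). It relies only on DN being roughly proportional to exposure time before saturation:

```python
def next_exposure(t, measured_dn, target_dn):
    """One auto-exposure iteration: scale the exposure time so the
    measured reference DN moves toward the target DN (DN is roughly
    proportional to exposure time below saturation)."""
    return t * target_dn / measured_dn

def terminate_early(measured_dn, target_dn, early_term_pct):
    """Early-termination test: stop iterating when measured and target
    DN agree to within the commanded percentage."""
    return abs(measured_dn - target_dn) <= early_term_pct / 100.0 * target_dn
```

For example, if a 10 ms exposure yields a reference DN of 100 against a target of 200, the next trial exposure is 20 ms; with a 5% early‐termination parameter, a measured DN of 195 against a target of 200 ends the iteration.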

7.6. Video

Both Mastcams and MARDI can acquire video frames of 720p high‐definition format (1280 by 720 pixels); they can also run video commands that acquire larger or smaller format frames. For 720p format video, the maximum frame rate is between 5.9 and 7.7 frames per second, depending on the amount of interframe sensor flushing in use. For a full frame, the maximum frame rate is between 3.9 and 4.7 frames per second. Video commanding involves specification of the number of images (video frames) to be obtained, a time period to elapse between the acquisition of each frame, and the pixel dimensions and location of the starting pixel for subframing (if desired).

7.7. Image Compression

7.7.1. Acquisition and Compression

Acquired as 12‐bit images, Mastcam and MARDI data are converted on board the instrument to 8‐bit images using a square‐root companding look‐up table. This table preserves the original data's signal‐to‐noise ratio (which scales with the square root of the signal) and, through this encoding, preserves the dynamic range. Alternative tables (linear or compound linear) are available but rarely used. The companding table can be changed in flight if necessary. The nominal approach is to store the majority of images in the DEA flash memory in companded 8‐bit form (without compression or Bayer pattern interpolation), although it is possible to store acquired images in Bayer‐interpolated and JPEG or predictively compressed form. Data downlinked to Earth are, therefore, usually a copy of the onboard uncompressed parent image that has been either interpolated and JPEG compressed in the DEA, or left uninterpolated and predictively losslessly compressed. The raw parent images are often held in the DEA until the MSL science team is satisfied with the products received on Earth. In other words, if the team is unsatisfied with the compression results for a received image, a new product with a more suitable compression quality can be generated and sent to Earth days, weeks, or months later, for as long as the raw parent image remains in the DEA.
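A plausible form of such a square‐root companding table is sketched below. The actual flight look‐up table differs in detail, but the principle is the same: quantization step size grows with signal, matching the square‐root growth of shot noise so that little information is lost in the 12‐bit to 8‐bit conversion:

```python
import math

# Illustrative square-root companding table mapping 12-bit DN (0-4095)
# to 8-bit DN (0-255). The flight LUT differs in detail.
LUT = [round(255 * math.sqrt(dn / 4095)) for dn in range(4096)]

def compand(dn12):
    """Compand a 12-bit pixel value to 8 bits via the look-up table."""
    return LUT[dn12]
```

Note that the mapping is monotonic, so relative brightness ordering is preserved, and dark pixels retain finer quantization than bright ones.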

7.7.2. Uncompressed Data and Lossless Compressed Data

A full‐size raw, companded but uncompressed Mastcam or MARDI image is about 2 MB in size. Downlink of uncompressed images from Mars is rare. More rarely still, for calibration purposes, 12‐bit image data can be obtained by commanding the instrument to return a 16‐bit image. This “calibration mode” is not intended for use on Mars because the data cannot be compressed and the image size cannot exceed 1648 by 632 pixels (2 MB) owing to DRAM limitations. Thus, two 16‐bit images (4 MB of uncompressed data) are necessary to cover the full CCD.

The 8‐bit data can be returned with lossless compression when scientific need requires and downlink data volume permits. Lossless compression achieves approximately 1.3–1.5:1, and Bayer interpolation is not performed by the camera on board the rover. Rather, the R, G, and B pixels are separated and collected into single‐color arrays, and a predictive compressor is applied to each array: each pixel value is replaced by its difference from the preceding pixel, and these differences, generally being small numbers, compress better than the raw values. The results are concatenated into a single image file, ready to be transferred to the rover.
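The predictive step and its exact inverse can be sketched as follows (illustrative Python; the flight compressor also entropy codes the residuals, which is omitted here). Because the inverse reconstructs the input exactly, the scheme is lossless:

```python
def predict(pixels):
    """Replace each pixel with its difference from the preceding pixel;
    the small residuals compress better than the raw values."""
    out = [pixels[0]]  # first pixel is stored as-is
    for prev, cur in zip(pixels, pixels[1:]):
        out.append(cur - prev)
    return out

def unpredict(residuals):
    """Exact inverse of predict(), so no information is lost."""
    out = [residuals[0]]
    for r in residuals[1:]:
        out.append(out[-1] + r)
    return out
```

For example, the pixel run [10, 12, 11, 11, 15] becomes the residuals [10, 2, −1, 0, 4], which an entropy coder can represent compactly.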

7.7.3. Lossy Compressed Data

Most Mastcam and MARDI images are downlinked as color JPEGs; when downlink data volumes permit, a subset is downlinked a second time with lossless compression, as described above. To create the JPEG compressed products, the first step is to perform an RGB Bayer pattern interpolation based on the method of Malvar et al. [2004]. The interpolation kernels are manipulated to directly produce luminance (Y) and chrominance (Cr, Cb) color space components. Then, a color JPEG image is created using a commanded compression quality between 1 (most compression) and 100 (least compression) and a Y:Cr:Cb color subsampling scheme, usually either 4:4:4 (no color subsampling) or 4:2:2 (chrominance horizontally subsampled by a factor of 2). It is also possible to create a gray scale JPEG (luminance channel only, no chrominance). As data volume is reduced when 4:2:2 is used instead of 4:4:4, and the color difference is usually not noticeable, 4:2:2 is used most often.

7.7.4. Video Compression

The Mastcam and MARDI video command can be used to acquire a series of individual images or it can be used to take short videos. If the video command states that the images are to be obtained and stored in raw uncompressed form or in lossless compressed form, then the command results in storing each individual video frame as a single image. If the video command states that the images are to be lossy compressed, however, then one data product is produced for every 16 images acquired: a JPEG‐compressed group of pictures (GOP). Like other image acquisitions, commanding video products involves trade‐offs among image dimensions, compression quality, and Y:Cr:Cb subsampling. These Mastcam/MARDI videos are essentially Motion JPEG (M‐JPEG) files. One advantage of acquiring videos such that individual files are stored is that a thumbnail can be generated for every image in the video, and a miniature version of the video can be constructed prior to downlinking all full‐size images. For a GOP, only a thumbnail of the first image in the GOP can be returned.

7.8. Stereoscopy

An important capability of the Mastcams is the acquisition of stereoscopic images. Photogrammetric software in the Ground Data System uses stereoscopic data to compute range maps, which in turn can be transformed into triangulated irregular networks (meshes) and, eventually, digital elevation models and orthographically projected texture maps. This is the primary method by which range and topography are extracted from Mastcam data (Figure 9). The images can also be viewed anaglyphically.

Figure 9.

Figure 9

A stereo mosaic at various stages of processing. Acquired on sol 1051, the figure shows (left bottom) a gray scale version of the mosaic, (left top) a gray scale range map derived from the stereo products, and (right) an anaglyph (viewed with glasses having a red left lens and a cyan right lens) of a 3‐D rendering of the digital elevation and texture maps. The most distant features in this mosaic are more than 50 m away and almost 20 m lower in altitude. Images 1051M00462400010306074 through 1051ML0046240120306081 and 1051MR0046240010104584 through 1051MR0046240110104619.

7.9. Mastcam Focus Stack and Range Maps

A focus stack (also known as a z‐stack, in which z refers to the camera's optic axis) is a composite of multiple images focused at different distances within a given scene. The onboard focus stack acquisition and merge capabilities were developed primarily for MAHLI [Edgett et al., 2012]. Owing to the commonality of hardware and flight software, the Mastcams also can acquire focus stacks and perform onboard focus merges. A by‐product of this technique is a second image product, a range map, that identifies the portions of each image—acquired at a different z axis distance—that were incorporated into the best‐focus product.

Focus stacks can be acquired and the individual images can be relayed to Earth, or a focus merge can be performed on board by the instrument.

Mastcam focus stack acquisition was extremely rare during the first 2.3 Mars years of surface operations. Figure 10 shows an example. Onboard merges were performed only five times: once on sol 193, with data acquired that sol, and four times on sol 1052, with images obtained on sol 1051. In all five cases, these were M‐100 data. The M‐34 has sufficient depth of field that focus stacks contribute little to its images; the M‐100 depth of field is much narrower, and M‐100 imaging, especially of scenes spanning a large range of distances, can benefit from focus stacking.

Figure 10.

Figure 10

Mastcam‐100 onboard basic focus merge products for a geologic target named “Lion,” a slope‐forming, finely laminated mudstone mantled with dark gray cobbles and pebbles of a crossbedded eolian sandstone near the rover team's Buckskin drill site. These are products 1052MR0046290000104622R00 and 1052MR0046290000104623S0; the parent images were acquired on sol 1051 and were also subsequently downlinked. They are consecutively acquired images 1051MR0046240110104612C00 through 1051MR0046240110104619C00. In range maps as computed by the onboard software, bright is near and dark is distant.

7.10. Thumbnail Images

Thumbnail images are small color or gray scale versions of parent Mastcam or MARDI images. Each is ~8 times smaller than the parent in linear dimensions. Their main purpose is to provide a visual index of data stored in the DEA flash memory. They are downlinked at a higher priority than full‐scale images. In cases for which the onboard parent image is uncompressed, the thumbnail contains information that describes the compressibility of the parent image; this can be helpful in determining the JPEG compression quality to assign to the future return of that image in lossy compressed form. Nominally, thumbnails sent from Mars are 4:4:4 color JPEGs with a high commanded compression quality, typically quality 95.

As thumbnail images are usually JPEG‐compressed color images, the number of pixel rows and columns are whole number multiples of 16. This means that the largest thumbnails, for parent images of 1600–1648 by 1200 pixels, are 192 by 144 pixels in size (i.e., slightly less than 8 times smaller in linear dimensions than the parent). The details of this computation are shown in Table 1.
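The thumbnail dimension computation summarized in Table 1 amounts to taking 1/8 of the parent dimension and rounding down to a whole‐number multiple of 16; for example (illustrative Python):

```python
def thumb_dim(parent_pixels):
    """Thumbnail dimension: ~1/8 of the parent dimension, rounded down
    to a whole-number multiple of 16 (as required for JPEG products)."""
    return (parent_pixels // 8) // 16 * 16
```

Parents of 1600 to 1648 columns by 1200 rows thus all map to 192 by 144 pixel thumbnails, slightly less than a factor of 8 reduction in linear dimensions.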

Each image acquired as part of a focus stack is stored in its respective Mastcam DEA as an individual parent image. Thus, each image in a focus stack generates a thumbnail. Indeed, instrument operators can inspect these thumbnails before deciding how to proceed with downlinking the focus stack—do they want to choose a few frames and relay them to Earth, or do they want to select up to eight consecutive frames for an onboard focus merge? When created, focus merge products—a best‐focus image and a range map—are immediately stored in the DEA. Each of these two products generates a thumbnail. When images acquired using the video command are stored individually, each parent produces a thumbnail. For video GOPs, only the first frame in a GOP produces a thumbnail image.

8. Operations

8.1. Mission Operation Introduction

The MSL mission operation approach is closely based on the Mars Phoenix (PHX) 2007 approach [Bass and Talley, 2008; Eppler et al., 2013], which in turn closely followed the Mars Exploration Rover (MER) 2003 approach [Mishkin et al., 2006; Bass and Talley, 2008]. A key element of the general approach is operating on Mars time, in which work hours cycle around the Earth's 24 h clock by the ~40 min difference between the Earth day and the Mars day, losing a full day every ~36 days. However, owing to human factors (cycling around the clock wreaks havoc with the human circadian rhythm and creates other life stresses), operating on Mars time is not sustainable for more than a few months. It is used early in missions, when concern about the lifetime of the mission hardware is highest.
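The Mars time drift is simple arithmetic: a sol is about 24 h 39.6 min, so an Earth‐based team following Mars time starts roughly 40 min later each Earth day and cycles the full 24 h clock in about 36 Earth days:

```python
# Back-of-the-envelope check of the "Mars time" slip described above.
SOL_MINUTES = 24 * 60 + 39.6            # approximate length of a sol, in minutes
SLIP_PER_SOL = SOL_MINUTES - 24 * 60    # ~39.6 min of drift per Earth day
days_to_cycle = 24 * 60 / SLIP_PER_SOL  # Earth days to lap the 24 h clock once
```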

During Mars time operations, efforts are divided between longer‐term planning, termed “strategic,” and shorter‐term (every Mars day, called a “sol”) planning and execution, termed “tactical” [Mishkin et al., 2006; Bass and Talley, 2008]. The tactical process is generally limited by the time required to assess new data as they are received, plan the succeeding sol's activities, prepare commands to implement the succeeding sol's plan, and uplink the commands, all of which must fit between the receipt of the new data and the start of the next sol's work period. Operations on Earth generally coincide with the time the vehicle is “sleeping” on Mars. Initially, the tactical process took about 18 h during the MER mission [Mishkin et al., 2006]; the PHX Lander tactical process took about 12 h.

After operating on Mars time for some period, it becomes necessary to return to Earth time operations (owing to its short duration, the PHX mission remained on Mars time). Doing so requires reducing the tactical timeline to 10 h. One key to operating on Earth time is defining periods, termed “unrestricted sols,” during which the normal tactical process, including basing that process on fresh data, can be applied to every sol, while accepting that other sols, termed “restricted sols,” will not be planned using fresh data (this limits activities that require fresh data, such as driving or using the robotic arm). Some flexibility in when Earth‐time staffing shifts begin and end (e.g., starting at 6 AM or ending at 8 PM) expands the period of unrestricted sols [Mishkin et al., 2006; Bass and Talley, 2008]. A second key to operating for indefinitely long missions is to shift more effort into strategic planning, which takes the same approach used for planning restricted sols and expands it to defining longer‐term goals that feed into the tactical effort [Mishkin et al., 2006; Eppler et al., 2013].

8.2. Surface Operations

Abstracting from Mishkin et al. [2006] and Bass and Talley [2008], the four components of operations are

  1. science planning (determining what science is going to be investigated and how the instruments will be used),

  2. sequencing (creating the commands that tell the instrument what to do and when),

  3. uplink (the process of putting a full plan of rover and instrument sequences together, validating the plan and the commands, and sending them to the rover), and

  4. downlink (a multifaceted effort that involves the transmission of the data from the rover, either directly to Earth or through an Orbiter relay, its receipt on Earth, verification and validation of the health of the instrument, and processing and dissemination of the processed science data to the science team).

8.3. Science Planning

The flow of science planning has two input branches:

8.3.1. Strategic Planning; Preplanning for the Next Sol on the Prior Day

On the afternoon of the previous day, Project science team members and engineers create a skeleton tactical plan. It is based on guidance laid out by the strategic plan established by the Project Science Group, and general discussion within the science team, as well as expectations of engineering parameters (power, available downlink, and available uptime for the rover computer) based on the results of that previous day's tactical plan. This so‐called “N + 1 plan” is expected to change: by a little if the tactical plan executes nominally, significantly if new science objectives are discovered or decided upon at the new rover location, or completely if the rover's final position or further progress deviate from the expected plan or an anomaly is encountered, or if an alternate science‐decision branch is chosen that dictates such change.

8.3.2. Tactical Planning Based on Sol “N − 1” Data Received in Time for Planning

A day of tactical operations work begins with downlink: new data are received from the instruments, processed, analyzed, and then used to help the MSL Science Team plan the next one or more sols of science activities. This planning occurs in Science Theme Groups that specialize in specific disciplines (Geology, Atmospheric/Environment), and each of their science activity plans are eventually integrated and adjudicated in the Science Operations Working Group meeting. The integrated plan is reviewed and finalized during the Activity Planning and Approval Meeting, after which time the command sequences are finalized, checked, and uplinked [Mishkin et al., 2006; Bass and Talley, 2008].

8.4. Sequence Development and Uplink

During sol N planning, the individual rover and instrument plan fragments are discussed by the scientists, with input and guidance from the instrument engineers. The “sol N” plan includes most of the information needed to command the instrument being defined, and therefore, Mastcam sequencing can often begin before the final plan is approved (as long as what is actually sequenced achieves the plan). During sequencing, camera parameters for each individual image are established (e.g., exposure, pointing, color filters, and compression). Each specific planned activity is sequenced separately and eventually combined by the project Science Planner into a coherent, integrated master sequence that includes the commands for all the instruments and the rover [Bass and Talley, 2008]. The sequences must pass multiple checks and reviews, culminating in a Command Approval Meeting, after which the rover and instrument commands are staged for uplink to the spacecraft, and eventually transmitted to the rover either directly or through a relay orbiter [Mishkin et al., 2006; Bass and Talley, 2008].

8.5. Downlink

The rover can transmit data multiple times per day to the available orbiters, usually twice per day to MRO and Odyssey and less frequently to MAVEN. The data volume throughput provided by each orbiter depends on its relay capability (MRO has a more sophisticated relay radio than Odyssey) and the geometry of the rover‐orbiter relay link (directly overhead yields long, large downlink passes, orbits grazing the horizon give short, small downlink passes). After the data are transmitted to the orbiters, they await an opportunity for the orbiter to relay the data to Earth (usually after a few tens of minutes up to a few hours of latency), at which point the rover data are transmitted by the orbiter to the Deep Space Network (DSN) stations. Upon entering the DSN and JPL ground data systems, data are processed to extract the science data packets from the combined spacecraft telemetry stream, segregated by instrument, and for Mastcam and MARDI, received by the MSSS Operations staff and prepared by them for analysis by the science team.

The Mastcam downlink process retrieves the raw data from JPL's servers, and processes the images for eventual return to JPL servers for the science team to access and use in science planning and data analysis. Downlink operations personnel evaluate both spacecraft and instrument health and welfare telemetry to assess the status of the instruments.

Other downlink efforts involve managing the camera internal buffer memory resources (tracking how many images are residing in the buffer, how much space has been used, how much space is available, and how many images can still be taken without deleting previous data). These personnel also monitor the data resident in the rover's nonvolatile memory, into which Mastcam and MARDI data are transferred just prior to transmission to Earth. There is a constant interplay between camera buffer memory and the rover memory that needs to be monitored closely and managed owing to the statistical fluctuations in the downlink data volume.

The ultimate downlink effort is preparing materials for archiving with the Planetary Data System. This occurs in an episodic flow, based on a negotiated delivery schedule.

9. MARDI Characterization and Calibration

9.1. Background

As noted previously, the descope of MARDI affected its calibration. MARDI was tested and calibrated at MSSS in June–July 2008; this included an abbreviated thermal vacuum test. Additional geometric calibration occurred after delivery, during rover ATLO camera calibration activities in December 2010.

9.2. Image Detection

9.2.1. Detector Selection

Based on manufacturer‐provided data describing voltage requirements and minimal defective pixels, we selected 16 KAI‐2020CM CCDs from a lot purchased for the MSL MARDI, MAHLI, and Mastcam effort as candidates for integration with the flight unit cameras. We subjected these 16 detectors to bench‐level ambient testing and a 160 h burn‐in at 125°C (per MIL‐STD‐883 Method 1015 for Class B electronic parts, MIL‐STD‐883 [1996]) to qualify them for spaceflight and to rank their relative performance. Ultimately, the detector selected for the flight MARDI, designated k6999, had the fewest defective pixels (two).

9.2.2. System Gain, Full Well, Read Noise, and Linearity

We characterized the gain (electrons, e⁻, per DN), read noise (e⁻), and full well capacity (e⁻) of the MARDI signal chain by imaging a diffusely illuminated integrating sphere target and subsequently analyzing photon transfer curves using the technique described by Janesick et al. [1987]. The uncompressed, linearly companded (8‐bit) image data were acquired at room temperature in a cleanroom laboratory setting on 7 July 2008. We obtained pairs of images at exposure durations of 0, 5, 10, 20, and 50 ms at lamp currents in 0.5 A increments between 0.0 (dark) and 8.0 A. The photon transfer curve is shown in Figure 11: gain 126.452 e⁻ per DN (15.8064 e⁻ per 12‐bit‐equivalent DN), read noise 17.5071 e⁻, and full well 29,453.6 e⁻.
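The mean‐variance analysis at the heart of the photon transfer technique can be sketched as follows. This is a minimal illustration of the Janesick et al. [1987] approach, not the MSSS analysis code; it assumes dark‐subtracted flat‐field frame pairs in DN, ordered from dark to bright.

```python
import numpy as np

def photon_transfer(pairs):
    """Estimate system gain (e- per DN) and read noise (e-) from pairs of
    flat-field frames at increasing signal (Janesick et al., 1987).

    pairs: list of (frame_a, frame_b) 2-D arrays in DN, dark-subtracted.
    """
    means, variances = [], []
    for a, b in pairs:
        a = a.astype(float)
        b = b.astype(float)
        means.append(0.5 * (a.mean() + b.mean()))
        # Differencing the pair cancels fixed-pattern noise; the variance
        # of the difference is twice the temporal (shot + read) variance.
        variances.append(np.var(a - b) / 2.0)
    # In the shot-noise-limited regime, var_DN = mean_DN / gain + read_var,
    # so a straight-line fit of variance vs. mean gives 1/gain as the slope.
    slope, intercept = np.polyfit(means, variances, 1)
    gain = 1.0 / slope                                 # e- per DN
    read_noise = gain * np.sqrt(max(intercept, 0.0))   # e-
    return gain, read_noise
```

The intercept of the fit (read noise) is far more sensitive to variance‐estimation error than the slope, which is why photon transfer analyses typically use many signal levels spanning several decades.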

Figure 11.

Figure 11

MARDI photon transfer curve and derived gain, read noise, and full well parameters from analysis of prelaunch calibration images acquired in early July 2008.

9.2.3. System Dark Current and Bias

We characterized the rate of accumulation of dark current (charge generated thermally in the CCD) and the bias level as a function of temperature during preflight testing. Because of MARDI's fast readout rate and short exposure times, the bias is removed digitally by zero‐filling the companding look‐up table appropriately; if an inappropriate bias is selected, values below it are clipped to zero. MARDI EDL images exhibit very little dark current at typical flight operating temperatures (well below 20°C). More important is a constant, noiseless DC offset that causes dark pixels to sit at slightly positive values instead of zero. This offset must be removed for the square‐root companding to work properly, and it is a function of temperature and, more significantly, readout rate. For the MARDI EDL sequence, tests were done during the mission cruise phase (in the dark environment inside the aeroshell) to determine the DC offset in the maximum frame rate mode to be used; a value of 155 was ultimately selected. MARDI temperature during EDL was about −20°C, and the dark pixels transmitted in each image average about 15 DN, so this adjustment was mostly successful.

During surface operations, MARDI is operated at a lower frame readout rate, and the DC offset in this mode is lower, at 126 DN. For the long exposure times required for twilight imaging, the main dark current effect is a significant number of hot pixels. These have been mapped and have been removed from some of the images that have gone through our research image processing pipeline.
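The offset‐aware companding scheme described above can be sketched as a look‐up table. The actual flight table is not reproduced here; this is a minimal illustration assuming a simple square‐root mapping of the offset‐subtracted 12‑bit range onto 8 bits, with entries below the DC offset zero‐filled as the text describes.

```python
import numpy as np

def sqrt_companding_lut(dc_offset):
    """Build a 12-bit -> 8-bit square-root companding look-up table that
    zero-fills entries below the DC offset (illustrative; the flight LUT
    may differ in detail).

    dc_offset: constant offset in 12-bit DN (e.g., 155 for the EDL-rate
    readout mode, 126 for the slower surface-operations rate).
    """
    dn12 = np.arange(4096)
    lut = np.zeros(4096, dtype=np.uint8)
    above = dn12 >= dc_offset
    # Square-root encoding allocates code values in proportion to shot
    # noise, preserving precision in the companded 8-bit data.
    span = 4095 - dc_offset
    lut[above] = np.round(
        255.0 * np.sqrt((dn12[above] - dc_offset) / span)
    ).astype(np.uint8)
    return lut
```

Values below the offset map to zero, which is why an underestimated offset leaves dark pixels at small positive DN (as observed during EDL) rather than clipping signal.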

9.3. Focus and Resolution

9.3.1. System Focus

MARDI is a fixed‐focus system; focus was set during optics integration using a lappable aluminum spacer, based on bar target image quality for a target at about 7 m distance. Infinity focus was confirmed by imaging with MARDI in a cleanroom environment, pointed through two windows at the outside, sunlit environment, at targets about 55 m distant (Figure 12).

Figure 12.

Figure 12

Portions of a MARDI infinity focus and solar illuminated color characterization image acquired on 7 July 2008. MARDI was in a cleanroom environment and was pointed through a window, toward a window in the next room, to obtain views of blue sky, green palm fronds, parked automobiles, and three persons, two of whom were holding a Macbeth ColorChecker® color chart. (a) Portion of the image that has been lightened to make the two windows visible. (b) Enlarged portion of the same image with no calibration, no color adjustment applied. Neither view is geometrically corrected. These images demonstrate that MARDI faithfully reproduces color as seen by the human eye under Earth solar illumination conditions.

9.3.2. System Modulation Transfer Function

System modulation transfer function (MTF) refers to the contrast and resolution performance of the fully assembled camera head. After assembly, prior to delivery, MTF measurements were made using four different techniques: two based on measuring the spatial frequency at which 50% contrast was achieved (the USAF 1951 test target and the Koren 2003 test target), and two that determined the MTF curve from high‐contrast edges (the NIH ImageJ MTF plug‐in and a Mars Orbiter Camera bar target analytical tool). These techniques all gave a spatial frequency of 0.3 at 50% modulation, and the MTF curves intercepted the Nyquist frequency at around 0.25 modulation. The MTF curves for the two Mastcams, MARDI, and the Juno camera are shown in Figure 13. During ATLO activities in January 2011, with MARDI mounted on the Curiosity rover and the rover in a cradle rotated upside down, the camera acquired images of a suite of resolution targets suspended at a distance of 8.6 m above the rover (Figure 14). We applied the same multiple analysis techniques and determined the MTF at Nyquist to be ~0.25 modulation and the spatial frequency at 50% modulation to be 0.33, consistent with the previous measurements.
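The edge‐based techniques mentioned above follow a standard recipe: differentiate the edge spread function (ESF) across a high‐contrast edge to obtain the line spread function (LSF), then take its Fourier magnitude. The sketch below illustrates that recipe; it is not the ImageJ plug‐in or the MOC analysis tool actually used.

```python
import numpy as np

def mtf_from_edge(edge_profile):
    """Estimate the MTF from a 1-D intensity profile across a
    high-contrast edge. Returns (spatial frequencies in cycles/pixel,
    MTF normalized to 1 at zero frequency)."""
    esf = np.asarray(edge_profile, dtype=float)
    lsf = np.diff(esf)                 # line spread function
    lsf = lsf * np.hanning(lsf.size)   # taper to suppress truncation ringing
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]                      # normalize to unity at DC
    freqs = np.fft.rfftfreq(lsf.size, d=1.0)
    return freqs, mtf
```

Given a measured profile, the 50%‑modulation frequency and the value at Nyquist (0.5 cycles/pixel) can then be read directly off the returned curve.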

Figure 13.

Figure 13

Modulation transfer curves for three MSL cameras and Junocam. Junocam uses an optical design nearly identical to that of MARDI, and the same detector as the MSL cameras but without the Bayer color filter array (CFA). It is known that Bayer CFAs reduce the spatial frequency response at Nyquist by a factor of 2 [Yotam et al., 2007]. Spatial frequency is the number of light‐to‐dark transitions per unit distance. For discrete sampling, the highest frequency achievable is 1 cycle per sample pair, or 0.5. In practice, MTFs use pixels as the measure of distance.

Figure 14.

Figure 14

Interpolated Bayer color image of targets on the overhead crane drip shield in the Spacecraft Assembly Facility (SAF) High Bay at JPL. Each side of the hexagonal drip shield is about 1.05 m in length, with opposing faces ~2.5 m apart. The target was at a height of about 8.6 m above the camera. Spatial resolution (50% modulation) was about 20 mm, or about three times the pixel scale of ~6.5 mm.

After landing, the MARDI view of the surface of Mars is, as expected, out of focus. Figure 15 shows a rock sample imaged in the laboratory with MARDI at 70 cm distance (top middle frame), the same rock imaged in focus with a consumer Bayer pattern color camera (bottom left frame), and examples of the latter image processed with Gaussian blurring filters of various pixel dimensions. Visual comparison of the frames in the upper tier of this figure suggests that the equivalent Gaussian blur is about 1.5 pixels. Applying that blur to representative targets and applying the same four quantitative techniques to examine MTF and resolution indicates a spatial frequency of 0.1 at 50% modulation, roughly 3 times poorer than in‐focus images.
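As a cross‐check on the ~1.5 pixel figure, the MTF of a pure Gaussian blur has the closed form exp(−2π²σ²f²), so the frequency at which the blur alone reaches 50% modulation can be computed directly. This ignores the in‐focus camera MTF (the system MTF is the product of the two), so it is an upper bound on the system 50%‑modulation frequency, not a replacement for the four measurement techniques above.

```python
import math

def gaussian_mtf(f, sigma):
    """MTF of a Gaussian blur of standard deviation sigma (pixels)
    at spatial frequency f (cycles/pixel): exp(-2 pi^2 sigma^2 f^2)."""
    return math.exp(-2.0 * math.pi**2 * sigma**2 * f**2)

def f50(sigma):
    """Frequency (cycles/pixel) at which the Gaussian blur alone
    falls to 50% modulation: sqrt(ln 2 / 2) / (pi * sigma)."""
    return math.sqrt(math.log(2.0) / 2.0) / (math.pi * sigma)
```

For σ = 1.5 pixels this gives f50 ≈ 0.12 cycles/pixel for the blur alone, which sits sensibly above the measured system value of 0.1 once the in‐focus camera MTF is folded in.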

Figure 15.

Figure 15

Icelandic basalt cobble imaged (top row, center) by MARDI at ~70 cm, the approximate distance of the camera from the ground while Curiosity is on the surface, and (bottom left) by a Nikon D‐70, a consumer camera with many similarities to MARDI, matched to scale as if MARDI were in focus at 70 cm. The other images show the Nikon image subjected to Gaussian blurring filters between 1 and 2.5 pixels in scale. Visual inspection shows that the MARDI image quality falls between the 1 pixel and 1.5 pixel smeared images, consistent with a resolution of ~1.5 mm.

9.3.3. Scale at Boresight

For many applications, the scale of a MARDI image can be determined using the IFOV at the boresight, measured as 0.76 mrad in prelaunch testing. The image scale at the boresight at distance d is d × (0.76 × 10⁻³). Because of the distortion of MARDI's wide‐field optics, the scale changes across the field; the MARDI camera model and geometric correction (section 9.6) can accommodate this.
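The boresight scale relation above is a one‐line computation; a minimal sketch:

```python
IFOV = 0.76e-3  # rad/pixel at the boresight, from prelaunch testing

def boresight_scale(distance_m):
    """Ground scale (meters/pixel) at the MARDI boresight for a target
    at distance_m meters. Valid near the image center only; wide-angle
    distortion changes the scale across the field (section 9.6)."""
    return distance_m * IFOV
```

For example, at the 8.6 m target distance of the ATLO resolution test this gives ~6.5 mm/pixel, matching the pixel scale quoted in the Figure 14 caption, and at the ~66 cm surface standoff it gives ~0.5 mm/pixel.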

9.4. Radiometry

9.4.1. Flat Field

With its wide‐angle lens, the transmission of light to the CCD falls off radially from the MARDI optic axis. An illustration of the MARDI flat field product archived with the NASA PDS is shown in Figure 16. This product is a composite of six MARDI images in which the camera FOV was completely filled by a Spectralon target lit by artificial illuminators (halogen; fluorescent) in a cleanroom setting on 2 July 2008. To create the product, the raw, uncompressed 8‐bit per pixel images were decompanded to 12 bits, corrected for dark current, and normalized for differences in exposure. The CCD's red, green‐1, green‐2, and blue microfiltered pixel arrays were treated independently. We applied no smoothing operators, filters, or other spatial or local operators, and we made no corrections to bad pixels or optical obstructions in the source images. Essentially, the radiometric calibration of MARDI was identical to that of MAHLI [Edgett et al., 2015] and Mastcam [Bell et al., 2017]. Subsequent to delivery of MARDI to JPL, no further opportunities to acquire new or improved flat field data were available, and on Mars, MARDI cannot be pointed skyward for the “sky flat field” images that are routinely acquired for the Mastcams and MAHLI [Bell et al., 2017; Edgett et al., 2015].
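Applying such a flat field to a MARDI frame follows the same per‐channel logic described above (decompand, dark‐subtract, divide by the channel‐normalized flat). The sketch below illustrates the idea; the Bayer phase layout and normalization convention are assumptions for illustration, not the archived pipeline.

```python
import numpy as np

def flat_field_correct(raw12, dark, flat):
    """Per-Bayer-channel flat-field correction (illustrative sketch).

    raw12: decompanded 12-bit image (DN); dark: matching dark frame;
    flat:  flat-field product such as FLAT_MD_0.IMG.
    """
    img = raw12.astype(float) - dark.astype(float)
    out = np.zeros_like(img)
    # Treat the four Bayer phases (red, green-1, green-2, blue)
    # independently, as described in the text.
    for di in (0, 1):
        for dj in (0, 1):
            ch = np.s_[di::2, dj::2]
            f = flat[ch]
            # Normalize the flat per channel so the correction preserves
            # the channel's mean signal level.
            out[ch] = img[ch] / (f / f.mean())
    return out
```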

Figure 16.

Figure 16

MSL MARDI flat field image. For illustration purposes, the dynamic range has been adjusted here. The archival product is called FLAT_MD_0.IMG and is available in the NASA PDS archives.

9.4.2. Lens Transmission and System Spectral Throughput

MARDI lens transmission as a function of wavelength is the composite of light passing through the lens element glasses, antireflective coatings, and the spectral filter designed to limit transmission to 399–675 nm. The transmission measured before launch, applicable to the sol 0 descent images, is nearly flat in this range (Figure 17). Of course, transmission was altered by deposition of a thin film of dust on the MARDI lens during the terminal descent and landing events on sol 0; this change has not been characterized because MARDI cannot view any calibration surfaces. Using the nominal detector quantum efficiency (QE) from the manufacturer's data sheet and the measured transmission (T), the QE × T response is shown in Figure 17. These values were coarsely validated in instrument‐level ground testing using monochromator imaging.

Figure 17.

Figure 17

MARDI system‐level lens transmission and red, green, and blue response (quantum efficiency (QE) times transmittance (T)) as a function of wavelength.

9.4.3. System Radiometric Performance

We confirmed that actual MARDI performance meets the expectation based on lens transmission and CCD QE by imaging an integrating sphere and a calibrated quartz‐tungsten‐halogen lamp traceable to National Institute of Standards and Technology standards. After subtracting dark current, the observed signal level of the integrating sphere output is within ~12% of prediction for all three Bayer filter band passes, with the measured levels slightly lower than predicted (Table 2).

Table 2.

MARDI Signal (Electrons, e⁻) Performance as a Function of Bayer Red, Green, and Blue Microfiltered Pixels Relative to Prediction

Band	Predicted (e⁻)	Measured (e⁻)	Difference
Blue	8,828	7,749	12.2%
Green	18,377	16,683	9.2%
Red	29,825	27,338	8.3%

9.5. Color Adjustment and White Balance

White balance and color adjustments are generally a matter of individual preference or objective. Targets can look quite different, independent of calibration or other image processing, depending on whether the scene is entirely in shadow, partially in shadow, in twilight illumination, or in full Sun. The terrain beneath the descending rover on sol 0 was in afternoon sunlight; after touchdown, the scene is partially shadowed by rover hardware, and the lens had likely become coated by a film of dust. Subsequent imaging by MARDI during the course of the surface mission was preferentially, though not always, performed in twilight illumination in order to minimize the large contrasts between sunlit and shadowed surfaces. These high contrasts also caused light to scatter off the dust on the lens, reducing local contrast. The uniform illumination afforded by twilight lighting greatly improved the clarity of the images (Figure 18).

Figure 18.

Figure 18

Illumination effects on MARDI imaging. (left) Image 0553MD01830000101677E01, taken about 40 min after sunset, illuminated by diffuse sky brightness. (right) Image 0554MD01840000101678E01, acquired shortly after noon. These highly processed images (including spatial filtering) show how poor daylight imaging is compared to twilight imaging. The arrow points to the same rock.

MARDI data archived with the NASA PDS include a product to which we have applied a color adjustment for qualitative use [Caplinger, 2013; Malin et al., 2015]. Figure 19 shows an example of the results. The MARDI color correction factors (1.06 for red, 1.00 for green, and 1.22 for blue) were computed using prelaunch images of a sample of beta cloth and standard color and gray scale targets (a Macbeth ColorChecker® color chart) under Earth solar illumination. Flight images of the heat shield calibration target can be used for comparison (e.g., Figure 19), but these are restricted to the few frames in which the target was visible but still shadowed by the backshell (immediately after heat shield jettison); once directly illuminated, the target was saturated. The latter method ignores color contributions from scattering in the Martian atmosphere. (Comparable values for the Mastcam flight images are 1.2, 1, and 1.26.) A different set of multiplicative values (1, 1.16, and 1.86) can be used to create essentially white balanced color images for all cameras.
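The color adjustment itself is a per‐channel multiplication by the quoted factors; a minimal sketch, in which the output clipping range (12‑bit) is an assumption for illustration:

```python
import numpy as np

# Prelaunch-derived MARDI color-correction factors quoted in the text.
MARDI_RGB_FACTORS = (1.06, 1.00, 1.22)
# Alternative multipliers for essentially white-balanced products.
WHITE_BALANCE_FACTORS = (1.0, 1.16, 1.86)

def color_adjust(rgb, factors=MARDI_RGB_FACTORS):
    """Apply a per-channel multiplicative color adjustment to a
    demosaicked H x W x 3 image (sketch of the qualitative correction
    applied to the archived color-adjusted products)."""
    out = rgb.astype(float) * np.asarray(factors)
    return np.clip(out, 0, 4095)  # keep results within the 12-bit range
```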

Figure 19.

Figure 19

MARDI color adjustment example from the 35th frame acquired during descent, from an altitude of 9.75 km. (a) Original image received from Mars, which had been losslessly compressed on board; the same Bayer pattern interpolation that can be performed on board the instrument was applied after receipt. This is NASA PDS archive EDR product 0000MD0000000000100035C00_XXXX. (b) The same image with the color adjustment as applied to the radiometrically corrected NASA PDS archive RDR product (_DRCX). (c) The same color processing as in Figure 19b, also geometrically linearized to adjust for the lens distortion (archive product _DRLC). The heat shield is circular, 4.5 m in diameter, about 12.8 m from the camera, and tilted about 20°.

9.6. Geometric Calibration and Camera Model

The MARDI 90° FOV lens imparts noticeable “barrel” distortion to the image, so it was important to characterize this distortion. This was done in ATLO using a series of images of planar dot targets positioned underneath the rover and covering the MARDI field of view (e.g., Figure 20). This effort occurred in December 2010 and is described in Maki et al. [2012]. Two analyses of these images were performed.

Figure 20.

Figure 20

MARDI lens distortion correction. (a) Original calibration image acquired 17 September 2010. (b) Linearized (geometric corrected) product based on CAHV model parameters. These data were acquired with MARDI on board the Curiosity rover during ATLO testing at JPL‐Caltech. Note the rover's front left wheel at the bottom right. Dot target is 55.88 cm square; each dot is 2.54 cm in diameter. Images have been color‐corrected using radiometric calibration data. MARDI ATLO calibration image MrdiImage_0348139331_75996‐1.

One analysis, performed at MSSS, produced interior orientation (center pixel and radial distortion) coefficients that allow the raw image to be geometrically resampled to the image that would have been produced by a camera with no distortion (Figure 20b). The result corresponds to a CAHV model for a simple camera with a rectangular detector and fixed focal length. Each component is a three‐dimensional vector; the camera model mathematically maps a three‐dimensional position in object space to a two‐dimensional point in the image:

C: camera position

A: camera pointing attitude (actually points into the camera)

H: horizontal pointing vector, toward the right of the image (in detail, more complicated than this)

V: vertical pointing vector (down).

These geometrically corrected products (Figure 19c), archived with the PDS, can be used with a simple “pinhole,” four‐vector CAHV camera model; this is likely easier for most users to process and analyze. Table 3 summarizes these interior orientation coefficients.
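Projection through a linear CAHV model reduces to two dot‐product ratios; a minimal sketch, applicable only to linearized (distortion‐free) images such as the geometrically corrected MARDI products:

```python
import numpy as np

def cahv_project(p, c, a, h, v):
    """Project a 3-D point p into image (column, row) coordinates with a
    linear CAHV camera model [Yakimovsky and Cunningham, 1978].

    c, a, h, v: the four 3-D model vectors (camera center, axis,
    horizontal, vertical)."""
    d = np.asarray(p, float) - np.asarray(c, float)
    denom = d.dot(a)                    # range along the camera axis
    return d.dot(h) / denom, d.dot(v) / denom  # (column, row) in pixels
```

For intuition, a pinhole camera with focal length f pixels, boresight (i0, j0), and axis A has H = f·x̂ + i0·A and V = f·ŷ + j0·A, so a point on the boresight projects exactly to (i0, j0).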

Table 3.

MARDI MSSS‐Produced Camera Model Coefficients

Lens distortion coefficients:
k1 = −3.589522 × 10⁻³, k2 = 1.828246 × 10⁻⁵, k3 = −5.040188 × 10⁻⁸

Interior orientation coefficients:
Boresight (pixel location): i0 (column) = 588.381, j0 (row) = 819.047
Affine coefficients (pixels per mm): a11 = 135.154157, a12 = −0.038589, a21 = 0.0, a22 = 135.135135

A second analysis, by JPL personnel, created a six‐vector CAHVOR camera model following the same methodology reported for the MSL Navcam and Hazcam instruments by Maki et al. [2012]. (The CAHVOR acronym refers to the vectors that permit transformation from object to image coordinates: C is the camera center vector, A is the axis vector, H is horizontal, V is vertical, O is the optical axis vector, and R is the radial distortion vector [Yakimovsky and Cunningham, 1978; Gennery, 2001; Di and Li, 2004].) This analysis followed the procedure described by Yakimovsky and Cunningham [1978] as modified by Gennery [2001, 2006]. The CAHVOR parameters for each image accompany all of the MARDI data products archived with the NASA PDS, except the geometrically corrected products. This model can be used directly with uncorrected images if CAHVOR‐aware software is available.

Basically, delivering linearized (camera‐distortion‐corrected) RDR images to the PDS permits simpler models to be used to mosaic or extract photogrammetric information from the images, while the somewhat more complicated CAHVOR model permits users to perform similar functions on uncorrected images.
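Using the Table 3 coefficients, the MSSS radial distortion model can be sketched as the conventional odd polynomial r′ = r(1 + k1·r² + k2·r⁴ + k3·r⁶). The units of r (assumed here to be millimeters in the focal plane, consistent with the affine coefficients being in pixels per mm) and the direction of the mapping are assumptions for illustration; the authoritative definitions are in the MSSS calibration documentation [Caplinger, 2013].

```python
# Table 3 coefficients (MSSS interior-orientation model).
K1, K2, K3 = -3.589522e-3, 1.828246e-5, -5.040188e-8
I0, J0 = 588.381, 819.047      # boresight, pixels
PX_PER_MM = 135.154157         # a11; a22 is nearly identical

def distort(i, j):
    """Map an undistorted (column i, row j) pixel location to where the
    lens actually images it, using r' = r (1 + k1 r^2 + k2 r^4 + k3 r^6)
    with r in mm (assumed convention; see lead-in)."""
    x = (i - I0) / PX_PER_MM   # focal-plane offsets in mm
    y = (j - J0) / PX_PER_MM
    r2 = x * x + y * y
    scale = 1.0 + K1 * r2 + K2 * r2**2 + K3 * r2**3
    # Barrel distortion: scale < 1 away from the boresight, so image
    # points are pulled toward the center.
    return I0 + x * scale * PX_PER_MM, J0 + y * scale * PX_PER_MM
```

With these coefficients the scale factor at the corner of the frame is roughly 0.85, i.e., ~15% barrel distortion, consistent with the visibly curved dot grid in Figure 20a.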

9.7. Stray and Scattered Light

The design of the MARDI lens used standard best practices to minimize stray light sensitivity. The lens has a conical sunshade that prevents direct illumination of the MARDI front optical surface by sources more than 72° off axis. The interior surface of the sunshade was grooved and black anodized to further reduce scattered light sensitivity. At the time and location of the MSL landing, the Sun was 34° above the horizon. For that Sun elevation, the MARDI sunshade prevents direct illumination of the front surface for off‐nadir attitude excursions of up to 52°.

The MARDI design was not analyzed or tested for stray light susceptibility. However, based on analyses performed for the similar OSIRIS‐REx TAGCAMS lens design, the stray light signal in the MARDI images was very likely less than 1%.

10. Concluding Statements

The MSL mission is still underway; as of 1 January 2017 it had exceeded 15 km of traverse distance during 2.3 Mars years of nearly continuous operations. Mastcam remains the science imaging workhorse of the mission, with more than 45,000 M‐34 and 35,000 M‐100 images acquired, an average of about 50 images per sol of activity.

Neither the two Mast cameras nor MARDI show any evidence of degradation in performance from their exposure and operation on Mars. MARDI acquired a dust veneer on its front element during EDL, and dust accumulation since then has been difficult to assess because MARDI has no calibration target; exposure times, however, have not changed appreciably. The Mastcam observations of the calibration target show no demonstrable change during the mission (other than changes attributable to dust accumulation on, and removal from, the calibration target itself). No mechanical or electronic issues have been observed. Detector performance remains unchanged. The Mastcam mechanisms have shown no sign of wear, no incidences of skipped motor counts, nor other indications of the potential loss or uneven redistribution of lubricant. The Mastcam flight software is designed to prevent commanding the cameras to harm themselves, so we expect the cameras to continue to perform at their present levels for the foreseeable future.

Acknowledgments

Funded by the U.S. National Aeronautics and Space Administration (NASA), the Mastcam and MARDI investigations were developed (2005–2011) and operated (2011–present) under California Institute of Technology (Caltech) Jet Propulsion Laboratory (JPL) subcontracts 1269421, 1273887, and 1516826 to Malin Space Science Systems of San Diego, California. JPL‐Caltech manages MSL for NASA and part of this research was carried out at JPL. Hundreds of people ultimately contributed to the development of Mastcams, MARDI, and the capabilities to deliver and move the cameras around Curiosity's field site on Mars. We heartily thank all of them, although we regret we cannot name them all here. First and foremost, we thank our spouses, partners, and families for their support and sacrifices during the >12 years since we first began working on Mastcam and MARDI. In particular, we thank MSSS engineering support for Mastcam and MARDI development provided by Scott Brylow, Paul Otjens, Chris Martin, Hakeem Olawale, and Matt Clark; MSSS operations and science staff Jason K. Van Beek, David Harker, Daniel Krysak, Leslie Lipkaman, Brian Nixon, Robert Zimdar, Angela Magee, Tex Kubacki, Carrie Kubacki, Megan Kennedy Wu, Burt Baker, Jennifer Sandoval, Kim Supulver, Liliya Posiolova, and Bruce Cantor; MSSS Information Technology staff S. Davis, K. Stoiber, and E. Balentine; and MSSS administrative personnel, including the efforts of Michael Mullenniex, Linda Maze, and Diana Michna. We further thank the engineers at Alliance Spacesystems (now the Space Division of MacDonald Dettwiler and Associates' MDA Information Systems), particularly Jason Bardis, Jacques Laramee, Brett Lindenfeld, Rius Billing, Daniel DiBiase, Helen Aslanian, Richard McKenzie, Richie Gov, Sean Dougherty, Jennifer Baker, Cole Corbin, and Todd Cameron; onboard focus merge software development by T. 
Lesperance; Dean Szwabowski for the mechanical design support; the MSL Project Scientists Ashwin Vasavada (2015–present), John Grotzinger (2007–2014), Ed Stolper (2005–2007), and Frank Palluconi (2004–2005); the MSL Deputy Project Scientist Joy Crisp; MSL Project Managers Pete Theisinger, Richard Cook, and Jim Erickson; the MSL Program Scientist Michael Meyer; Deputy Program Scientist Mary Voytek; the JPL MSL Payload Manager, John J. (Jeff) Simmonds, and the Payload Office staff, particularly the Mastcam and MARDI instrument engineers, Reg Willson (2004–2010) and Aaron Sengstacken (2010–2012); JPL MSL development and testing staff including Daniel Limonadi, Art Thompson, Jessica Samuels, Dave Gruel, Becky Henninger, Betina Pavri, Sean Howard, Dave Johnson, Mike Johnson, and many others; instrument testing support from Toby Le and Kevin Shannon of Cubic Defense Systems; JPL MSL operation planning leads Michael Watkins and Nicole Spanovich; L. Keely, D. Lees, Ara Nafian, and Lorenzo Fluckiger at NASA Ames for planning and analysis software development; the Mastcam/MARDI Instrument Preliminary Design (I‐PDR) Review Board T. Fraschetti (chair), J. Baker, G. Fraschetti, P. Wu, W. Harris, R. Kemski, W. Mateer, G. Reeves, F. Vescelus, R. West, J. Johnson, and P. Hardy; the Mastcam/MARDI Instrument Critical Design (I‐CDR) Review Board T. Fraschetti (chair), G. Fraschetti, H. Kieffer, W. Harris, C. Kingery, W. Mateer, G. Reeves, F. Vescelus, R. West, P. Hardy, and G. Kinsella; the Mastcam and MARDI Instrument Delivery Review Board, T. Fraschetti (chair), H. Kieffer, W. Harris, C. Kingery, W. Mateer, G. Reeves, R. Paynter, and A. Vasavada; and the Mastcam/MARDI Calibration independent review board N. Izenberg, J. Johnson, K. Klaasen, and R. West. 
The data used in this report are available from the Planetary Data System (PDS), including the calibration information that appears in the CALIB/ subdirectory of the MARDI volumes; the remaining calibration information is incorporated in the text and figures of this report.

Malin, M. C. , et al. (2017), The Mars Science Laboratory (MSL) Mast cameras and Descent imager: Investigation and instrument descriptions, Earth and Space Science, 4, 506–539, doi:10.1002/2016EA000252.

References

  1. Anderson, R. C. , et al. (2012), Collecting samples in Gale crater, Mars; an overview of the Mars Science Laboratory Sample Acquisition, Sample Processing and Handling System, Space Sci. Rev., 170, 57–75, doi:10.1007/s11214‐012‐9898‐9. [Google Scholar]
  2. Arvidson, R. E. , et al. (2014), Terrain physical properties derived from orbital data and the first 360 sols of Mars Science Laboratory Curiosity rover observations in Gale crater, J. Geophys. Res. Planets, 119, 1322–1344, doi:10.1002/2013JE004605. [Google Scholar]
  3. Bass, D. , and Talley K. (2008), Phoenix surface mission operations processes, J. Geophys. Res., 113, E00A06, doi:10.1029/2007JE003051. [Google Scholar]
  4. Bell, J. F. III , et al. (2003), Mars Exploration Rover Athena Panoramic Camera (Pancam) investigation, J. Geophys. Res., 108(E12), 8063, doi:10.1029/2003JE002070. [Google Scholar]
  5. Bell, J. F., III , Joseph J., Sohl‐Dickstein J. N., Arneson H. M., and Johnson M. J. (2006), In‐flight calibration and performance of the Mars Exploration Rover Panoramic Camera (Pancam) instruments, J. Geophys. Res., 111, E02S03, doi:10.1029/2005JE002444. [Google Scholar]
  6. Bell, J. F., III , et al. (2017), The Mars Science Laboratory Curiosity rover Mast Camera (Mastcam) instruments: Pre‐flight and in‐flight calibration, validation, and data archiving, Earth Space Sci., doi:10.1002/2016EA000219. [Google Scholar]
  7. Bhandari, P. , Birur G., Pauken M., Paris A., Novak K., Prina M., Ramirez B., and Bame D. (2005), Mars Science Laboratory thermal control architecture, Soc. Automob. Eng. Int. Conf. Environ. Syst., Tech. Pap. 2005‐01‐2828, doi:10.4271/2005‐01‐2828.
  8. Caplinger, M. (2013), MSSS MSL camera calibration summary, version 1.1, NASA Planetary Data System archives. [Available at http://pds‐imaging.jpl.nasa.gov/data/msl/MSLMST_0013/CALIB/MSL_MMM_CAL.TXT, accessed 30 December 2016.]
  9. CCITT (1993), International Telegraph and Telephone Consultative Committee, Information Technology – Digital Compression and Coding of Continuous‐tone Still Images – Requirements and Guidelines – Recommendation T.81 (9/92), International Telecommunication Union.
  10. Davis, K. , Herman J., Maksymuk M., Wilson J., Chu P., Burke K., Jandura L., and Brown K. (2012), Mars Science Laboratory's dust removal tool, Aerospace Mechanisms Symposium 41, pp. 279–292, NASA Conf. Proc. NASA/CP‐2012‐217653.
  11. Di, K. , and Li R. (2004), CAHVOR camera model and its photogrammetric conversion for planetary applications, J. Geophys. Res., 109, E04004, doi:10.1029/2003JE002199. [Google Scholar]
  12. DiBiase, D. R. , and Laramee J. (2009), Mars Hand Lens Imager: Lens mechanical design, Proc. IEEE Aerospace Conf., doi:10.1109/AERO.2009.4839434. [Google Scholar]
  13. DiBiase, D. , Bardis J., and Billing R. (2012), A zoom lens for the MSL mast cameras: Mechanical design and development, Aerospace Mechanisms Symposium 41, NASA Conf. Proc. NASA/CP‐2012‐217653, 293–310.
  14. Edgett, K. S. , et al. (2012), Curiosity's Mars Hand Lens Imager (MAHLI) investigation, Space Sci. Rev., 170, 259–317, doi:10.1007/s11214‐012‐9910‐4. [Google Scholar]
  15. Edgett, K. S. , et al. (2015), Curiosity's robotic arm‐mounted Mars Hand Lens Imager (MAHLI): Characterization and calibration status, Version 2, 05 October 2015, MSL MAHLI Tech. Rep. 0001, doi:10.13140/RG.2.1.3798.5447.
  16. Eppler, D. , et al. (2013), Desert Research and Technology Studies (DRATS) 2010 science operations: Operational approaches and lessons learned for managing science during human planetary surface missions, Acta Astronaut., 90, 224–241, doi:10.1016/j.actaastro.2012.03.009. [Google Scholar]
  17. Gennery, D. B. (2001), Least‐squares camera calibration including lens distortion and automatic editing of calibration points, in Calibration and Orientation of Cameras in Computer Vision, edited by Grun A., and Huang T., pp. 123–136, Springer, Berlin, Germany, doi:10.1007/978‐3‐662‐04567‐1_5. [Google Scholar]
  18. Gennery, D. B. (2006), Generalized camera calibration including fish‐eye lenses, Int. J. Comput. Vis., 68, 239–266, doi:10.1007/s11263‐006‐5168‐1. [Google Scholar]
  19. Ghaemi, F. T. (2009), Design and fabrication of lenses for the color science cameras aboard the Mars Science Laboratory rover, Opt. Eng., 48(10), 103002, doi:10.1117/1.3251343. [Google Scholar]
  20. Ghaemi, F. T. (2011), Ultra high precision alignment of the elastomerically mounted elements of the science camera lenses for the Mars Science Laboratory (MSL) rover, Appl. Opt., 50(26), 5108–5114, doi:10.1364/AO.50.005108. [DOI] [PubMed] [Google Scholar]
  21. Grotzinger, J. P. , et al. (2012), Mars Science Laboratory mission and science investigation, Space Sci. Rev., 170, 5–56, doi:10.1007/s11214‐012‐9892‐2. [Google Scholar]
  22. Haggart, S., and Waydo J. (2008), The mobility system wheel design for NASA's Mars Science Laboratory mission, 11th European Conf. of the Int. Soc. for Terrain‐Vehicle Systems, Torino, Italy.
  23. Huck, F. O., Taylor G. R., McCall H. F., and Patterson W. R. (1975), The Viking Mars lander camera, Space Sci. Instrum., 1, 189–241.
  24. Janesick, J. R., Klaasen K. P., and Elliott T. (1987), Charge‐coupled‐device charge‐collection efficiency and the photon‐transfer technique, Opt. Eng., 26, 972–980, doi:10.1117/12.7974183.
  25. Kornfeld, R. P., Prakash R., Devereaux A. S., Greco M. E., Harmon C. C., and Kipp D. M. (2014), Verification and validation of the Mars Science Laboratory/Curiosity rover entry, descent, and landing system, J. Spacecr. Rocket., 51(4), 1251–1269, doi:10.2514/1.A32680.
  26. Le Mouélic, S., et al. (2015), The ChemCam Remote Micro‐Imager at Gale crater: Review of the first year of operations on Mars, Icarus, 249, 93–107, doi:10.1016/j.icarus.2014.05.030.
  27. Leer, K., et al. (2008), Magnetic properties experiments and the Surface Stereo Imager calibration target onboard the Mars Phoenix 2007 lander: Design, calibration, and science goals, J. Geophys. Res., 113, E00A16, doi:10.1029/2007JE003014.
  28. Landis, G. A., and the MER Athena Science Team (2007), Observation of frost at the equator of Mars by the Opportunity rover, Proc. Lunar Planet. Sci. Conf. 38, Abstract 2423.
  29. Madsen, M. B., et al. (2003), The magnetic properties experiments on the Mars Exploration Rover mission, J. Geophys. Res., 108(E12), 8069, doi:10.1029/2002JE002029.
  30. Maki, J. N., et al. (2003), Mars Exploration Rover engineering cameras, J. Geophys. Res., 108(E12), 8071, doi:10.1029/2003JE002077.
  31. Maki, J., Thiessen D., Pourangi A., Kobzeff P., Litwin T., Scherr L., Elliott S., Dingizian A., and Maimone M. (2012), The Mars Science Laboratory engineering cameras, Space Sci. Rev., 170, 77–93, doi:10.1007/s11214‐012‐9882‐4.
  32. Malin, M. C., Caplinger M. A., Carr M. H., Squyres S., Thomas P., and Veverka J. (2001), Mars Descent Imager (MARDI) on the Mars Polar Lander, J. Geophys. Res., 106(E8), 17,635–17,650, doi:10.1029/1999JE001144.
  33. Malin, M. C., Edgett K. S., Cantor B. A., Caplinger M. A., Danielson G. E., Jensen E. H., Ravine M. A., Sandoval J. L., and Supulver K. D. (2010), An overview of the 1985–2006 Mars Orbiter Camera science investigation, Mars, 5, 1–60, doi:10.1555/mars.2010.0001.
  34. Malin, M. C., Edgett K., Jensen E., and Lipkaman L. (2015), Mars Science Laboratory Project Software Interface Specification (SIS), Mast Camera (Mastcam), Mars Hand Lens Imager (MAHLI), and Mars Descent Imager (MARDI) Experiment Data Record (EDR) and Reduced Data Record (RDR) PDS Data Products, Version 1.3. NASA/Jet Propulsion Laboratory Document JPL‐D‐75410, SIS‐SCI035‐MSL.
  35. Malvar, H. S., He L.‐W., and Cutler R. (2004), High‐quality linear interpolation for demosaicing of Bayer‐patterned color images, IEEE Conf. Acoust. Speech Signal Process., 3, 485–488, doi:10.1109/ICASSP.2004.1326587.
  36. Maurice, S., et al. (2012), The ChemCam instrument suite on the Mars Science Laboratory (MSL) rover: Science objectives and mast unit description, Space Sci. Rev., 170, 95–166, doi:10.1007/s11214‐012‐9912‐2.
  37. Mishkin, A., Limonadi D., Laubach S., and Bass D. (2006), Working the Martian night shift—The MER surface operations process, IEEE Rob. Autom. Mag., 13(2), 46–53, doi:10.1109/MRA.2006.1638015.
  38. MIL‐STD‐883 (1996), Test method standard for microcircuits, United States Department of Defense, Revision E, December 1996.
  39. Novak, K. S., Kempenaar J. E., Bhandari P., and Dudik B. A. (2012), Mars Science Laboratory rover system thermal test, Int. Conf. Environ. Syst., 42, AIAA‐2012‐3516, doi:10.2514/6.2012‐3516.
  40. Prakash, R., et al. (2008), Mars Science Laboratory entry, descent, and landing system overview, Proc. IEEE Aerospace Conf., doi:10.1109/AERO.2008.4526283.
  41. Schorghofer, N., and Edgett K. S. (2006), Seasonal surface frost at low latitudes on Mars, Icarus, 180, 321–334, doi:10.1016/j.icarus.2005.08.022.
  42. Smith, P. H., et al. (1997), The imager for Mars Pathfinder experiment, J. Geophys. Res., 102(E2), 4003–4025, doi:10.1029/96JE03568.
  43. Sullivan, R., Anderson R., Biesiadecki J., Bond T., and Stewart H. (2011), Cohesions, friction angles, and other physical properties of Martian regolith from Mars Exploration Rover wheel trenches and wheel scuffs, J. Geophys. Res., 116, E02006, doi:10.1029/2010JE003625.
  44. Vasavada, A. R. (2016), Where Curiosity has taken us, Eos, 97, 16–21, doi:10.1029/2016EO043009.
  45. Vasavada, A. R., et al. (2014), Overview of the Mars Science Laboratory mission: Bradbury Landing to Yellowknife Bay and beyond, J. Geophys. Res. Planets, 119, 1134–1161, doi:10.1002/2014JE004622.
  46. Warner, N., Silverman M., Samuels J., DeFlores L., Sengstacken A., Maki J., Scodary A., Peters S., Litwin T., and Metz B. (2016), The Mars Science Laboratory remote sensing mast, IEEE Aerospace Conf., doi:10.1109/AERO.2016.7500554.
  47. Yakimovsky, Y., and Cunningham R. (1978), A system for extracting three‐dimensional measurements from a stereo pair of TV cameras, Comput. Graphics Image Process., 7, 195–210, doi:10.1016/0146‐664X(78)90112‐0.
  48. Yotam, E., Ephi P., and Ami Y. (2007), MTF for color Bayer pattern detector, Signal Processing, Sensor Fusion, and Target Recognition XVI, Proc. SPIE, 6567, 65671M, doi:10.1117/12.723140.

Articles from Earth and Space Science (Hoboken, N.J.) are provided here courtesy of Wiley