“You’re really only very small”: The Google Earth Engine and the grand scheme of things.

Michael Shohet
Published in Induction
8 min read · Aug 4, 2020

Landsat 8 in orbit. © NASA.

The Google Earth Engine overwhelms with its vastness. Even jaded by our incredible technological progress, the human mind still reels from this kind of unfettered access to planetary-scale information. Talk about ego-death. As you composite jungles and polar ice caps, the self begins to recede. After a while, it slowly dissolves into the primordial abyss of a global consciousness, and you feel whatever George Harrison circa 1967 must’ve felt: “You’re really only very small, and life flows on, within you and without you.”

But such meditations are part of the normal workflow for climatologists, oceanographers and geophysicists, who have been processing and interpreting environmental data for decades. Although satellite programs have been extremely important to the Earth sciences, remote sensing imagery is only one type of data used in research. This far-out information complements an already extensive set of methodologies, including the analysis of physical samples, land surveys, and plant and animal tracking.

Nevertheless, satellite data is indispensable. It adds extreme-scale macro-observations to the arsenal of environmental knowledge mining. The states, and state changes, of entire landmasses, glaciers and deserts can be used to grasp the overwhelming yet delicate reality of our world. Most importantly, we can finally witness human works in the greater scheme of things, particularly the destruction we have brought on through deforestation, urbanization, abuse of water, zoological holocausts, and gross industrial negligence and overreach. This confession can be used to correct our near-fatal mistakes: purposeful, precise determination leading to sober, cognizant decision-making and to profound, conscientious action, devoid of cynicism and indulgence. It must be done, if life is to have any meaning at all.

Pixels.

The archives available through the Earth Engine go back to the early 1970s, when the first Landsat satellite was put into orbit. The early Landsat sensors captured reflected light from the Earth’s surface in a handful of spectral bands: visible bands covering the green and red frequencies, and near-infrared bands, mainly used to detect and analyze vegetation. Later missions added short-wave infrared bands, which can identify man-made materials and cut through haze. The images themselves were composed of pixels, each representing a patch of the planet’s surface 30 to 80 meters on a side. It was up to the scientists to determine what those pixels represented. Is it a forest fire? Is it a new town in the Mojave? Wasn’t there a river here only a year ago?

To determine what the pixels meant, researchers developed interpretive techniques and algorithms. They stacked the transmitted images into layers and then observed how the same pixel behaved from one layer to the next. Any change a pixel exhibited from photograph to photograph was monitored and interpreted. This was extremely time- and labor-intensive, as the number and the scale of the images were massive. How much meaning could be wrung from the pixels was closely tied to the available computing power.
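
In the Earth Engine’s JavaScript API, that layer-to-layer comparison boils down to simple arithmetic between images. Here is a minimal sketch of the idea, assuming the Landsat 8 surface reflectance collection and its near-infrared band as named in the public catalog; the dates and display range are arbitrary:

// A sketch of per-pixel change detection between two annual composites.
var l8 = ee.ImageCollection("LANDSAT/LC08/C02/T1_L2");
// Build a median composite of the near-infrared band for each year.
var nir2018 = l8.filterDate("2018-01-01", "2019-01-01").select(["SR_B5"]).median();
var nir2019 = l8.filterDate("2019-01-01", "2020-01-01").select(["SR_B5"]).median();
// Subtracting the stacked layers shows how each pixel changed year over year.
var nirChange = nir2019.subtract(nir2018);
Map.addLayer(nirChange, { min: -2000, max: 2000 }, "NIR change");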

Through the EE platform, Google offers this power with the company’s legions of computers connected in parallel. The EE is constructed according to a neoteric computing paradigm, under which all calculations are performed on the server (in this case, a myriad of servers). In this way, a single user can work on thousands of images at a time across a circuit of connected computers. In addition to the scale capacity, the EE framework supplies the tools for examining the data: masking, mosaicking, common geospatial analysis algorithms, machine learning capabilities, and general development environments. Satellite data that once took years of painstaking computation to analyze can be processed and interpreted within weeks!
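
For a taste of those tools, here is a one-line masking sketch, assuming the public SRTM elevation image from the catalog; the 1000-meter threshold is an arbitrary choice:

// Masking: keep only pixels above 1000 meters of elevation.
var srtm = ee.Image("CGIAR/SRTM90_V4");
var highlands = srtm.updateMask(srtm.gt(1000));
Map.addLayer(highlands, { min: 1000, max: 4000 }, "Above 1000 m");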

Wait a second; let’s back up a bit. RGB? Pixels? Layers? Masks? Does it seem a little familiar? Well, probably anyone who has used any image-editing software in the last thirty years will have their own sensors tingling. Here we have our great revelation: the Google Earth Engine can be grasped as a gargantuan, virtual Photoshop, complete with value rendering, blending options, vector and raster formats, opacity control and colorization. Of course, the EE is a little more than that. Along with manipulating imagery, you can visualize snow cover rates in Scandinavia for the last twenty years, train the EE to recognize city blocks in Cairo, or model hurricanes in the Caribbean. But considering our discovered Photoshop ancestry, the crux of the platform remains a single element: the pixel.

Working with the Earth Engine.

OK, let’s finally try it! You have to sign up first, which you can do with any Google account. After registration, you get access to the datasets, the resources and the native, web-based IDE. The IDE is actually really nice and comes with a syntax-aware JS editor, all of the API documentation, sample scripts, Git interfaces, and pretty painless tools for debugging, testing and deployment. Welcome to the new lazy-eval normal, I suppose.
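
That lazy evaluation is easy to see for yourself. A statement in the editor only builds a description of a computation, and nothing is evaluated until you ask for a result (the numbers here are arbitrary):

// This line computes nothing locally; it builds a server-side object.
var sum = ee.Number(2).add(40);
// Printing it sends the computation to the servers, which answer: 42.
print(sum);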

Here we will use a dataset created with images from MODIS (the Moderate Resolution Imaging Spectroradiometer) on NASA’s Aqua satellite, a mighty and fierce descendant of the original Landsat. The currently active MODIS has 36 spectral bands, and the dataset we will use provides pixel data from a band that identifies the presence of chlorophyll a in the oceans. Chlorophyll a reflectance usually indicates vegetative biomass, such as algae and phytoplankton. We can use the Earth Engine to display the regions of the planet where chlorophyll a wavelengths are reflected. By doing this, we can track and study phytoplankton populations and observe the state of biodiversity in the Earth’s oceans.

First, let’s import the MODIS Ocean product dataset as an Earth Engine ImageCollection and store it in a variable. Use the code editor in the middle window of the IDE to type or paste the line:

var modis = ee.ImageCollection("NASA/OCEANDATA/MODIS-Aqua/L3SMI");

Since the image collection spans years (i.e., thousands of images), we should filter it to narrow down the time of interest. To do this, we provide start and end dates as arguments to the filterDate() method, and then run the method on the collection:

var modisFiltered = modis.filterDate("2019-04-01", "2020-05-01");

Now we select the spectral band we wish to examine: the readings from an actual light sensor on the satellite! In this case, we want to isolate the band named ‘chlor_a’. To do this, we use the select() method on our filtered collection and pass the band name in a list of strings.

var chlor_a = modisFiltered.select(["chlor_a"]);

Now we are ready to visualize the chlor_a band pixels on the map. Because we are visualizing only one band, the result will be monochromatic (three bands are needed to take on the red, green and blue values). But for single-band visualizations we can provide a palette of colors to colorize the pixels. The EE stretches the palette across the pixel values according to their intensity. A minimum and a maximum can also be specified to control the sharpness and the accuracy of the visualization.

But first, let’s center the map on the North Atlantic:

Map.setCenter(-27.36, 44.83, 3);

The first two arguments are the longitude and the latitude, respectively. The third argument is the zoom level.

Then we add the layer that will render the very pixels from the satellite. Pretty exciting moment! Unlike the previous statements, this operation is performed on the client side. We call the Map.addLayer() method and pass the object to be rendered (our selected collection), the visualization parameters (the color palette, along with the minimum and the maximum values), and an arbitrary name for the layer to be used by the map.

Map.addLayer(
  chlor_a,
  {
    palette: ["darkblue", "lightgreen", "green", "darkgreen"],
    min: 0,
    max: 6.5
  },
  "Chlorophyll a"
);

Notice that the visualization parameters are passed as an object, with the palette, the min, and the max as its properties. The palette value is a list of colors that starts with the color for the min and ends with the color for the max. So clear sea with a minimal amount of phytoplankton is rendered dark blue, and small to large amounts of phytoplankton are rendered in deepening shades of green. For the max we will use 6.5, because it gives a nice visualization, not because it is scientifically meaningful; in reality, the concentration of chlorophyll a in a body of water can be as high as 100 milligrams per cubic meter.

OK, let’s see that plankton already! Just click the Run button in the menu above the code editor, and the composite should be calculated and fetched from the Earth Engine servers. It should look something like this:

What we are actually seeing is all of the images at once, mosaicked together and stacked, mimicking established geospatial methodology. To smooth the visualization out a little, let’s use the mean() function, which averages the intensities of each pixel across the images.

To do this, let’s apply the reducing function to our filtered collection and store the mean in a new variable. Then we just plug it into Map.addLayer().

var chlor_aMean = chlor_a.mean();

Map.addLayer(
  chlor_aMean,
  {
    palette: ["darkblue", "lightgreen", "green", "darkgreen"],
    min: 0,
    max: 6.5
  },
  "Chlorophyll a mean"
);

We might have to wait a moment or two for our image to be returned. Remember, this is a mosaic of AN ENTIRE YEAR of photographs of THE ENTIRE PLANET EARTH at a resolution of roughly four kilometers per pixel! Can you picture all those Google machines smoking? Once composited and reduced, it should look something like this:

Well, do you feel it now, my friend? Do you feel the unfathomable energies slowly embracing you from some enormous and mysterious darkness? You are George Harrison now. You are one with the plankton. “With our love we could save the world.”

Obviously, this is only the tip of the iceberg (which, incidentally, you can calculate and visualize with the EE and the MODIS glacier data). The Earth Engine, as a vital tool of knowledge, demands prolific development and dedicated study. Environmental research has become a crucial battle line between mindful science and cynical ignorance, between compassionate, economically viable conservation and obtuse, gluttonous environmental exploitation. And once again we look to an enlightened technological remedy to mend the vicious wounds we have inflicted on our little living rock, and on ourselves.
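
If you want to pull on that thread, or on the Scandinavian snow mentioned earlier, here is a hedged sketch, assuming the MOD10A1 daily snow cover product and its NDSI_Snow_Cover band as listed in the public catalog; the dates, palette and map center are arbitrary:

// A sketch of visualizing average snow cover over a winter window.
var snow = ee.ImageCollection("MODIS/061/MOD10A1")
  .filterDate("2020-01-01", "2020-03-01")
  .select(["NDSI_Snow_Cover"])
  .mean();
Map.setCenter(18.0, 64.0, 4); // roughly Scandinavia
Map.addLayer(snow, { palette: ["black", "white"], min: 0, max: 100 }, "Snow cover");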

You can see the full phytoplankton tracking app at the Google Earth Engine Apps site: https://mlshohet.users.earthengine.app.

All code is at my GitHub: https://github.com/mlshohet/phytoplankton.

References.

Brown, Jesslyn and Jennifer Rover. “Tracking Change Across Time and Space with LCMAP,” U.S. Geological Survey, July 15, 2020. https://www.usgs.gov/center-news/tracking-change-across-time-and-space-lcmap.

Harrison, George. “Within You Without You,” Sgt. Pepper’s Lonely Hearts Club Band, Parlophone/EMI, 1967.

Liu, J.G. “REMOTE SENSING | Passive Sensors,” Encyclopedia of Geology, 2005, pp. 431–439. ScienceDirect, https://doi.org/10.1016/B978-0-12-409548-9.02956-0.
