Project Infinity Research Report_052310

Mother Ocean and The Garden of Eden

All life on our planet came from the Ocean, the mother of us all. Both reptiles and mammals originally emerged from the Ocean to walk onto the Earth. Following the recent discovery of active chimney vents on the Ocean floor, we have found numerous sulfur-based life forms that thrive at extremely high temperatures and have likely done so for billions of years, strongly suggesting to scientists that the first life on the planet began in the Ocean.

Our planet “Earth” is over two-thirds covered by Ocean. More than sixty percent of the human population lives near the Ocean’s coastline. Though we are land creatures, all terrestrial life depends on the Ocean for water. Every year, the Ocean’s coastal ecosystems provide $25 Trillion USD in seafood, minerals and energy products, making the Ocean the single largest natural resource for the world economy. Yet, despite all that it provides us, we have no decent map of the Ocean, which covers the super-majority of our planet’s surface.

Less than 5% of the United States Exclusive Economic Zone (EEZ) is mapped in high resolution, while it is estimated that an accurate map could yield the US as much as $1.4 Trillion USD in additional food, energy and mineral resources. We have accurate and complete maps of the Moon, Venus and Mars, but the majority of our own planet’s surface is covered in darkness and miles of seawater, and thus remains a mystery to us.

For the thousands of years of human history that we have interacted with the Ocean, we have only had the capacity to wonder and marvel at its vastness, power and beauty. We slowly developed the tools and skills necessary to exploit it, but only recently have we reached a level where we can begin to understand and master, control and destroy our mother Ocean. Only in the last decade has our technology advanced to the point that mapping the entire Ocean is finally possible.

With the rising threat of global climate change, we must begin to plan our interactions with the Ocean much more carefully if our civilization and species are to survive and prosper. However, without a map, plans are baseless assumptions destined to fail. Maps have always formed the basis of all plans: military, commercial and scientific plans all begin with a map.

“You cannot manage what you do not measure.” - Peter Drucker

History of Planet Ocean

Around 4.4 billion years ago, when most of the colossal meteor impacts had ceased, Earth’s molten surface began to cool and harden. This cooling caused water in the atmosphere to condense into liquid, raining torrentially onto the hardened surface and creating the first Ocean.

The early Ocean water was very acidic, with temperatures near 212 degrees Fahrenheit. Carbon dioxide from the atmosphere then started dissolving in the new Ocean and combining with calcium to form limestone, which eventually caused the skies to clear, the planet to warm, and water to begin evaporating back into the atmosphere.

Fast-forward to 2 billion years ago: the Ocean had cooled enough for algae and bacteria to develop, which in turn produced oxygen. Eventually, about 1 billion years ago, enough oxygen had accumulated to create an ozone layer, which shielded exposed organisms from the Sun’s radiation.

From about 500 to 300 million years ago, Ocean temperatures ranged from 68 to 104 degrees Fahrenheit, and its chemical makeup became somewhat similar to that of our modern Ocean. This was about the time the first true land creatures, reptiles, began to appear. Then, 250 million years ago, 90% of all species on the planet were killed. 50 million years later, the dinosaurs rose to rule the Earth. And again, some 65 million years ago, another catastrophe wiped out 75% of all life. Following the dinosaurs’ extinction, mammals started to dominate. About 38 million years ago, carbon dioxide in the atmosphere decreased, the planet began cooling, and the deep Ocean filled with frigid water.

Finally, about 20 million years ago, the atmosphere stabilized at 21% oxygen, 78% nitrogen, and less than 1% carbon dioxide and other gases, roughly what it is today. The continents continued separating. The poles formed large ice sheets, which reflected sunlight, causing further planetary cooling and major drops in sea level.

Over the last 20 million years, the global temperature has continued to fluctuate and sea levels have risen and fallen by hundreds of feet. However, only very recently have we seen compelling evidence that humanity’s output of carbon dioxide and other chemicals is changing the chemistry of the Ocean and the atmosphere to such a degree that we could produce dramatic and possibly catastrophic climate change.

The Ocean and planet will survive anything humanity can do; but the big question is: will we survive the changes we cause to the Ocean, Earth and atmosphere? Until we understand more about our planet, more than 71% of which is Ocean, we cannot know the cumulative damage or final outcome of our actions from only the last century or two.

In the last few decades, we have spent tens of billions to map the Moon, Venus and Mars, yet less than two percent of the Ocean floor has been mapped at comparable resolution. While it is interesting to study maps of other planets, our continued survival as a species depends on a better understanding of the Ocean, which should start with the study of a high-resolution map of its floor. It is difficult to understand what you cannot see.

Three Millennia of Man on the Sea

As far back as the 15th century BC, the ancient Phoenician civilization was dependent on the sea for food, travel and trade. Through their maritime trade, the Phoenicians spread the use of their alphabet to North Africa and Europe, where it was first adopted by the Greeks and later passed on to the Romans to become our shared Western alphabet.

For over 30 centuries, man sailed along the coasts, and sometimes across sections of the deep seas, for purposes of fishing, trade and conquest. Navigation was mostly by dead reckoning, and movement was limited, powered either by oar or by direct wind behind a sail.

In the 9th century, the Arabs invented the kamal, a crude device to measure latitude. However, it was not until the end of the Middle Ages, with the invention of the mariner’s astrolabe and, more importantly, the dry compass, that the stage was set for the Age of Discovery, and Western Europeans ventured out across the great Ocean in the late 15th century.

For the first oceanic explorations, the Portuguese leveraged the compass, progressive new advances in cartography and astronomy, and improved sailing ships, the most important being the creation of the caravel and, later, carrack designs.

Caravela Redonda /Square-rigged Caravel

Developed in the 15th century under the sponsorship of Prince Henry the Navigator of Portugal, the caravel was equipped with lateen sails allowing mariners to travel in almost any direction, which gave it both speed and the capacity for sailing windward. It was not until the development of the caravel that Western Europeans seriously considered Asiatic trade and oceanic exploration. Caravels were the first ships that could move beyond coastal navigation and the relatively placid Mediterranean, Baltic, or North Sea, and sail safely on the open Atlantic.

The first crucial breakthrough came in 1488, when Bartolomeu Dias rounded the southern tip of Africa, which he named "Cape of Storms" (later renamed the Cape of Good Hope by the King of Portugal for marketing purposes), anchoring at Mossel Bay and then sailing east as far as the mouth of the Great Fish River, thus proving that the Indian Ocean was accessible from the Atlantic.

Not until late in the 15th century, following the unification of the kingdoms of Castile and Aragon and the conquest of Granada from the Moors, did Spain become fully committed to searching out new trade routes and colonies overseas. Only then did Ferdinand II of Aragon and Isabella I of Castile decide to finally fund Christopher Columbus' expedition, hoping to bypass Portugal's lock on Africa by traveling west to reach the Indies.

Through the exploitation of new navigational technologies, Columbus believed he could safely reach India. A further key element in Columbus’ plan was his knowledge of the trade winds.

A brisk wind from the east, commonly called an "easterly", propelled the Santa María, La Niña, and La Pinta for five weeks from the Canaries. Columbus returned home by following the prevailing winds northeastward from the southern zone of the North Atlantic to the middle latitudes of the North Atlantic, where the prevailing winds blow eastward (westerly) to the coastlines of Western Europe, then curve southward towards Spain.

While Columbus was wrong about reaching India, he was right about a more vital fact: he could use the North Atlantic's great circular wind pattern, clockwise in direction, to get home. This knowledge was critical to his success, although it was not universally accepted as fact until 350 years later.

Soon after Columbus, in the early 16th century, the newly crowned Charles I of Spain funded Ferdinand Magellan’s voyage to reach the Indies by going west. While Magellan was killed in the Philippines, and only one of the five ships with 18 of the original 237 crew members returned to Spain, the voyage was the first successful circumnavigation of the planet.

The final result of Spain’s explorations was not only vast land acquisitions in the New World, but a massive influx of gold and silver into Spain exceeding US$2 trillion in modern terms during the course of the 16th century. It also resulted in most other European powers following suit, establishing colonies and trade around the world, leveraging and steadily improving oceanic travel. Exploration of the Ocean paid handsomely in nearly every way imaginable.

New Technology in the Age of Enlightenment

In the 18th century, two major advances within a period of only four years revolutionized the possibilities for traveling across the Ocean. While the kamal and astrolabes had been able to estimate latitude for centuries, it wasn’t until Sir Isaac Newton and a few others devised the octant in the 1730s that precise latitude measurement came within reach.

In 1757, the marine sextant was invented, taking the core concepts of the octant and slightly improving upon them, and with it the manual calculation of latitude was perfected.

Marine Sextant

Then, after years of worldwide scientific endeavor to solve the problem of longitude calculation, John Harrison in Britain perfected his marine chronometer in 1761. By the end of the 18th century, a properly equipped ship pilot could locate his position on the open Ocean with reasonable accuracy, without having to rely on dead reckoning.
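
The longitude calculation the chronometer made possible is simple in principle: the Earth turns 15 degrees per hour, so the difference between local apparent noon and the time kept at the reference meridian converts directly into longitude. A minimal sketch of the arithmetic (the times are illustrative, not from the report):

```python
# Longitude from a chronometer: the Earth rotates 15 degrees per hour, so
# each hour of difference between local time and Greenwich time corresponds
# to 15 degrees of longitude. The observation values are illustrative.

def longitude_degrees(local_hours: float, greenwich_hours: float) -> float:
    """Positive result means degrees west of Greenwich."""
    return 15.0 * (greenwich_hours - local_hours)

# The sun peaks (local noon, 12:00) while the chronometer reads 15:20 GMT:
print(longitude_degrees(12.0, 15.0 + 20.0 / 60.0))  # 50.0 degrees west
```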

So now that the sailor knew both where he was and where he wanted to go, the next challenge became deciding the best way to get there. However, this was a much bigger problem than it might seem, for while cartographers had developed decent maps of the Earth by 1800, no one had mapped the Ocean!

The Father of Modern Oceanography

In a brief period of only 20 years, from 1840 to 1860, humanity’s understanding of the Ocean was revolutionized by a handful of men. The first and most influential, Matthew Fontaine Maury, was nicknamed Pathfinder of the Seas, Father of Modern Oceanography and Naval Meteorology, and later, Scientist of the Seas. He was an American astronomer, historian, oceanographer, meteorologist, cartographer, author, geologist, and educator. While a mere lieutenant in the United States Navy, Maury led an international revolution in the understanding of efficient Ocean travel.

In 1842, Maury became the first superintendent of the United States Naval Observatory, holding that position until his resignation in April 1861. The observatory's primary mission was to care for the United States Navy's marine chronometers, charts, and other navigational equipment.

As a Navy sailor, Maury noted that there were numerous lessons that had been learned by shipmasters about the effects of adverse winds and drift currents on the path of a ship. The captains recorded these lessons faithfully in their logbooks, but they were then forgotten. At the Observatory, Maury uncovered an enormous collection of thousands of old ships' logs and charts in storage in trunks dating back to the start of the US Navy. Maury pored over these documents to collect information on winds, calms, and currents for all seas in all seasons. His dream was to put this information in the hands of all captains.

Maury’s work on Ocean currents led him to advocate his theory of the Northwest Passage, as well as the hypothesis that an area in the Ocean near the North Pole is occasionally free of ice. The reasoning behind this was sound: logs of old whaler ships recorded the designs and markings of harpoons, and harpoons found in whales captured in the Atlantic had been shot by ships in the Pacific and vice versa. This occurred with a frequency that would have been impossible had the whales traveled around Cape Horn.

Maury, knowing a whale to be a mammal, theorized that a northern passage between the oceans that was free of ice must exist to enable the whales to surface and breathe.

In 1843, Lieutenant Maury first published his Wind and Current Chart of the North Atlantic, which showed sailors how to use the Ocean’s currents and winds to their advantage and drastically reduced the length of Ocean voyages. His Sailing Directions and The Physical Geography of the Sea and Its Meteorology remain standards today.

Maury's uniform system of recording synoptic oceanographic data was adopted by navies and merchant marines around the world and was used to develop charts for all the major trade routes.

Maury advocated much in the way of naval reform, including a school for the Navy that would rival the Army’s West Point. His dream was finally fulfilled with the creation of the United States Naval Academy.

Maury also advocated an international sea and land weather service. He soon became convinced that adequate scientific knowledge of the sea could be obtained only through international cooperation. He proposed that the United States invite the maritime nations of the world to a conference to establish a “universal system” of meteorology, and he was the leading spirit of that pioneer scientific conference when it met in Brussels in 1853.

Within a few years, nations owning three fourths of the shipping of the world were sending their oceanographic observations to Maury at the Naval Observatory, where the information was evaluated and the results given worldwide distribution.

Maury had been sent by the United States to advocate his ideas for collecting sea data, but not land data. Still, as a result of the Brussels conference, a large number of nations, including many traditional enemies, agreed to cooperate in sharing both land and sea weather data using uniform standards.

Soon after the Brussels conference, Prussia, Spain, Sardinia, the free city of Hamburg, the republic of Bremen, Chile, Austria, Brazil, and others all joined the enterprise. The Pope established honorary flags of distinction for the ships of the Papal States, which could be awarded only to those vessels that filled out Maury’s abstract logs and sent them to him in Washington, D.C.

The Brussels conference was truly historic; never before had there been a US-inspired meeting that so successfully achieved an understanding among the leading nations of the Western world. In Europe and America, the press hailed the success of the conference as a monumental diplomatic and scientific achievement.

U.S.N. Matthew Fontaine Maury 1855

“Rarely before has there been such a sublime spectacle presented to the scientific world. All nations agreeing to unite and co-operate in carrying out one system of philosophical research with regard to the sea. Though they may be enemies in all else, here they are to be friends. Every ship that navigates the high seas with these charts and blank abstract logs on board, may henceforth be regarded as a floating observatory, a temple of science.”

- Lieutenant Matthew Fontaine Maury of the United States Navy

The Pioneers of Bathymetry

On January 3, 1840, Sir James Clark Ross of the British navy recorded a depth of 2425 fathoms, which was the first reasonably accurate oceanic depth sounding ever recorded.

Sir James Clark Ross; beside him is a dip circle designed by Robert Were Fox and used by Ross to locate the south magnetic pole.

Ross’s sounding was measured from boats off HMS Terror and HMS Erebus while Ross was traveling south to explore Antarctica. (The value of this sounding was checked by the NOAA ship Discoverer in 1968 and found to be 2100 fathoms, only 325 fathoms off.)

In October 1845, Lieutenant Charles Henry Davis commanding the Coast Survey brig Washington obtained a measured depth of 1300 fathoms and brought up a specimen of bottom.

Rear Admiral Charles Henry Davis

Davis had performed the first successful deep sea sounding made by the United States.

Considered the first true bathymetrist, Lieutenant William Rogers Taylor worked relentlessly from December 1850 through February 1852, attempting well over 100 soundings from his sloop of war USS Albany, under the direction of Maury, superintendent of the United States Naval Observatory.

Rear Admiral Taylor, USN

Maury counted 56 of the 100 soundings as valid and published Taylor’s field journal in his Wind and Current Charts in 1853. Taylor is credited with developing the method of timing the descent of the line through hundred-fathom intervals to determine the approximate time and depth at which the sounding line reached the bottom. This method was used for the next 30 years.
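
Taylor’s timing method lends itself to a short illustration: while the plummet is free-falling, each hundred-fathom interval takes only slightly longer than the one before; once the plummet reaches bottom, only drift pays out line and the interval time jumps. The interval times and jump threshold below are hypothetical, chosen only to show the idea:

```python
# Sketch of Taylor's method: time each 100-fathom interval of line running
# out. A sudden jump in interval time means the plummet has hit bottom and
# only drift is paying out line. All numbers here are hypothetical.

def bottom_depth_fathoms(interval_times: list[float], jump: float = 1.5) -> int:
    """Depth at which the interval time first jumps by the given factor."""
    for i in range(1, len(interval_times)):
        if interval_times[i] > jump * interval_times[i - 1]:
            return i * 100  # bottom reached during the i-th interval
    return len(interval_times) * 100  # no jump seen: still sinking

times = [52, 55, 58, 61, 65, 70, 76, 130, 135]  # seconds per 100 fathoms
print(bottom_depth_fathoms(times))  # 700 fathoms
```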

The second and last ship that Maury would have control over was the brig USS Dolphin, which he sent into the Atlantic to test sounding methods, search for vigias (reported sightings of rocks, islands, breakers, etc., which plagued 19th-century sailing charts) and make oceanographic observations.

Samuel Phillips Lee, United States Navy Rear Admiral

Under the leadership of Lieutenant Samuel Phillips Lee, the Dolphin crew perfected a method of sounding from small boats in November 1851, in order to keep the line perpendicular. In two years of work, Lee and his crew acquired 153 soundings.

Next, in 1852, Passed Midshipman J. M. Brooke invented the Brooke sounding machine, which allowed the recovery of bottom specimens to assure that the sounding plummet had in fact reached bottom, while using a heavy weight to keep the line taut when running out. Brooke also developed his table of "standard casts" utilizing the time interval and the weight of line out, and much improved the sounding apparatus.

The Civil War put an end to the deep-sea work of the United States Navy for many years, but it was carried on most successfully by the British, especially by Captain P. Shortland, who improved the Brooke sounding machine and was one of the first to enunciate the important rule in regard to tension on the line: "A sounding line should not be permitted to run free, but should be resisted by a force equal to the weight in water of a length of the line equal to the depth to be determined."
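
Shortland’s rule reduces to a simple prescription for the braking force applied to the reel. A sketch, with an assumed (illustrative) weight in water per fathom of line:

```python
# Shortland's rule: resist the running line with a force equal to the weight
# in water of a length of line equal to the depth being sounded. The
# per-fathom line weight is an assumed value for illustration only.

LINE_WEIGHT_IN_WATER_LB_PER_FATHOM = 0.12  # assumption, not a historical figure

def braking_force_lb(expected_depth_fathoms: float) -> float:
    return LINE_WEIGHT_IN_WATER_LB_PER_FATHOM * expected_depth_fathoms

print(braking_force_lb(2500))  # 300.0 lb of resistance for a 2500-fathom cast
```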

On July 1st, 1869, a Swedish expedition inadvertently discovered the first known seamount, approximately 200 miles west-southwest of Cape St. Vincent, Portugal. It is known today as Josephine Seamount, named after the expedition’s corvette Josephine.

The success of the Brooke device and its modifications in bringing up specimens of the bottom and its organisms attracted the attention of naturalists and geologists, and dredging in great depths was attempted. The results of the early work of Count Pourtales, just following the Civil War, under the direction of the United States Coast Survey brought about renewed interest by showing, as Pourtales says, "that animal life exists at great depths in as great an abundance as in shallow water."

In 1872 the British government outfitted the celebrated HMS Challenger expedition for investigating everything connected with the Ocean depths. Although Sir William Thomson had invented his sounding machine and submitted it to the British Admiralty months before the Challenger was ready, it was rejected for imperfections which might have been easily corrected, and the Challenger sailed with her antiquated outfit of sounding material, whereby time was lost as well as space for supplies and specimens.

The USS Tuscarora under Captain Belknap sailed from San Francisco only four months after the Challenger, but the United States Navy was wise enough to supply her with one or more Thomson machines in addition to the ordinary rope outfit. The new machines were easily brought into working shape by the Tuscarora's officers, and after a few trials entirely superseded the old apparatus. In June 1874, the Tuscarora measured a depth of 4,665 fathoms in the Japan-Kuril Trench, the deepest observed depth in the 19th century.

For the next 50 years (1875-1925), piano wire machines did all deep-sea survey work, during which time between 15,000 and 20,000 soundings greater than 1000 fathoms were taken, mainly driven by the need to determine locations for submarine cables; but the findings also laid the early groundwork necessary for modern earth science to eventually arrive at the theory of plate tectonics.

The Sigsbee Sounding Machine was the primary machine used in the United States. It improved on the Thomson machine chiefly in having an automatic spring governor to ease the strain on the wire due to the motion of the ship. It was the invention of Lieutenant C. D. Sigsbee, who conducted extensive deep-sea depth and current studies in the Atlantic and Gulf of Mexico while commanding the USC&GS steamer Blake.

The Blake was the most innovative oceanographic vessel of the 19th century and the first to define the continental margin. Its results were used to produce the first 3D model of the seafloor of the Western Atlantic Ocean in 1884.

Just prior to World War I, piano wire sounding reached its zenith as the German research ship Planet in 1912 sounded a depth of 5352 fathoms in the Philippine Trench.

Sound in Water: The Beginning of Modern Bathymetry

“If you cause your ship to stop, and place the head of a long tube in the water and place the outer extremity to your ear, you will hear ships at a great distance from you.”

- Leonardo da Vinci, 1490

While man had known for centuries that sound travels efficiently through water, it was not until the early 19th century that the speed of sound in water was measured. Francois Sulpice Beudant conducted the first accurate experiment on record in 1816, measuring it at 1500 meters per second. Ten years later, Colladon and Sturm measured it at 1435 meters per second in Lake Geneva.

The first attempt at echo sounding was made on August 24th, 1838 by Charles Bonnycastle, a professor at the University of Virginia, on the US Coast Survey brig Washington. Crude technology, the shallow depth of the water, and a testing location near the Gulf Stream probably frustrated the effort, producing inaccurate results. It would be over 70 years before echo sounding got another serious chance.

In 1912, Reginald Fessenden, the inventor first credited with transmitting voice over radio, joined the Submarine Signal Company and began work on his Fessenden oscillator. Due to the sinking of the Titanic, there was much interest in using sound to detect icebergs from a distance. On April 14, 1914, after two years of research, Fessenden successfully bounced sound off both an iceberg at ranges up to 2.5 miles and off the bottom about 1 mile below. The first Ocean mapping by sound had occurred.
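
The principle behind Fessenden’s result is the one every later echo sounder uses: depth is half the two-way travel time multiplied by the speed of sound in water, roughly the 1500 meters per second Beudant had measured a century earlier. A minimal sketch:

```python
# Echo sounding: a pulse travels to the bottom and back, so
# depth = sound_speed * two_way_time / 2. The 1500 m/s figure is the speed
# of sound in seawater noted earlier; the echo time is illustrative.

SOUND_SPEED_SEAWATER_M_S = 1500.0

def depth_meters(two_way_travel_time_s: float) -> float:
    return SOUND_SPEED_SEAWATER_M_S * two_way_travel_time_s / 2.0

# Fessenden's roughly 1-mile (~1600 m) bottom echo implies about 2.15 s:
print(depth_meters(2.15))  # 1612.5 m
```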

Following World War I, echo sounding research resumed. Dr. Harvey C. Hayes, a Navy physicist, invented his Hayes depth sounding device to national acclaim:

“NEW NAVY INVENTION TELLS OCEAN DEPTHS; Device of Dr. Hayes Measures Time of Sound From Ship to Ocean Bed and Back. ACTION IS INSTANTANEOUS; Great Value to Navigation Is Predicted by Acting Secretary Roosevelt.”

- New York Times Headline on July 6th, 1922

In November of 1922, the US Navy conducted a two-ship survey of the California coast from San Diego to San Francisco with the Hayes device. In less than a month, the USS Corry and USS Hull obtained approximately 5000 soundings, a 25% increase over the total number of deep Ocean soundings collected during 50 years of piano wire machines!

From 1925 to 1927, the Germans conducted the first great expedition to fully leverage echo sounding on their survey vessel Meteor. During a series of thirteen transects across the Atlantic, the Meteor took more than 67,000 soundings, thus tripling the total number of existing deep Ocean soundings in a single expedition.

Prior to World War II, millions of soundings were recorded globally using these new acoustic methods.

Cold War: The Dark Age of Bathymetry

From 1940 to 1970, most of the sounding data and new technology related to bathymetry was classified by both the US and the Soviet Union. One notable survey in the mid-1950s, on the survey ship Pioneer, led to a critical finding of earth science.

By towing a magnetometer behind the Pioneer, and combining Navy requirements and funding, USC&GS navigation and bathymetric expertise, and Scripps instrumentation and scientific expertise, the survey discovered magnetic striping on the seafloor. This was a key development leading to the theory of plate tectonics.

In 1961, the Seamap Project began, aiming to conduct the first worldwide systematic reconnaissance mapping of the seafloor. Equipped with the new Navy Transit satellite navigation system, the Pioneer was the first ship able to reliably fix its position during surveying operations when out of range of the Alaskan and Hawaiian LORAN chains.

During the mid-1960s, two great tools of bathymetry were developed. The first was the Deep Tow instrument system, built and operated by Scripps. While originally designed to obtain sea-floor slope information in the deep sea, ultimately, it evolved into a system with multiple sensors for characterizing the deep-sea environment including a downward-looking, narrow-beam sounding system, a side scan sonar system, television and still-camera systems, and a variety of other sensors and sampling devices.

The second great advancement was the development of the multi-beam sounding system, the first of which was installed on the US Navy Ship Compass Island in 1963. Multi-beam sounding systems obtain depths over a swath of bottom perpendicular to the heading of the survey ship, as well as directly below the ship (as in single-beam sounding systems). While multi-beam systems were a huge jump forward for bathymetry, very little of the technology’s newly acquired data was publicly released. Today, multi-beam systems coupled with accurate navigation are the standard for bathymetric mapping efforts.
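
The advantage of a swath over a single beam is easy to quantify: a fan of beams spanning a total angle θ covers a strip of seafloor roughly 2 · depth · tan(θ/2) wide on flat bottom. A quick sketch of that geometry:

```python
# Swath coverage of a multi-beam sonar over flat bottom: a fan spanning
# `swath_deg` degrees covers a strip about 2 * depth * tan(swath_deg / 2)
# wide, versus a single point per ping for a single-beam sounder.
import math

def swath_width_m(depth_m: float, swath_deg: float) -> float:
    return 2.0 * depth_m * math.tan(math.radians(swath_deg / 2.0))

print(swath_width_m(4000, 90))   # ~8000 m strip in 4000 m of water
print(swath_width_m(4000, 150))  # ~29,856 m with a modern 150-degree swath
```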

Finally, the 1960s saw the descent of the manned research submersibles. First, the US Navy acquired the Trieste, a bathyscaphe engineered by the Swiss Piccard family; on January 23rd, 1960, it dove to the bottom of the Challenger Deep at 35,814 ft with Lieutenant Don Walsh and Jacques Piccard aboard, taking man for the first and only time to date down to the deepest spot in the Ocean. Next, also funded by the Navy, the Woods Hole Oceanographic Institution launched Alvin in 1964, now famous for its career of unparalleled oceanic research and exploration. Only two years after its launch, Alvin paid for itself and became famous when it found a hydrogen bomb lost off the coast of Spain after a US Air Force mid-air collision.

NOAA’s Ark and Declassification of Advanced Bathymetry

Created in 1970, the National Oceanic and Atmospheric Administration (NOAA) is a scientific agency within the US Department of Commerce, formed from a conglomeration of three existing agencies that were among the oldest in the Federal Government: the United States Coast and Geodetic Survey, formed in 1807; the Weather Bureau, formed in 1870; and the Bureau of Commercial Fisheries, formed in 1871. In addition to its civilian employees, NOAA research and operations are supported by the 300 men and women who make up the NOAA Commissioned Officer Corps.

While the US Survey of the Coast was established by President Thomas Jefferson, NOAA was created by President Nixon "...for better protection of life and property from natural hazards...for a better understanding of the total environment...[and] for exploration and development leading to the intelligent use of our marine resources...” In 1971, Congress funded NOAA activities with $300 million; and by 1981, NOAA funding had increased to nearly $1 billion. Bathymetry was finally out of the military closet, back into the hands of the scientists and open to direct commercial development.

Furthermore, the first multi-beam sonar technology finally became commercially available in 1977. Now known as the SeaBeam Classic, this 16-beam system produced a 45-degree swath and was installed on numerous research vessels throughout the 1980s. In 1989, Atlas Elektronik installed a second-generation deep-sea multi-beam system called Hydrosweep DS on the German research vessel Meteor. The Hydrosweep produced up to 59 beams across a 90-degree swath, a vast improvement, and the installation was inherently ice-strengthened.

After Korean Air Lines Flight 007, carrying 269 people, was shot down in 1983 after straying into the USSR's prohibited airspace, President Ronald Reagan issued a directive making GPS freely available for civilian use, once it was sufficiently developed, as a common good. The first satellite was launched in 1989, and the 24th and last satellite was launched in 1994. The Navstar-GPS system became fully operational in April 1995.

Initially, the highest quality signal was reserved for military use, and the signal available for civilian use was intentionally degraded, a scheme called Selective Availability. This changed in 2000, when U.S. President Bill Clinton ordered Selective Availability turned off at midnight on May 1, 2000, improving the precision of civilian GPS from about 1000 feet to about 65 feet. Further advances with differential GPS increased accuracy to within 9 feet. Finally, kinematic GPS referenced to the World Geodetic System removed the need to collect a local mean water level measurement, improving overall vertical accuracy.

21st Century Bathymetry: The Age of SAR, LIDAR, WWW and GIS

Today, with multi-beam sonar systems now sporting hundreds of beams, double pinging, and swath angles up to 150 degrees, high resolution and accuracy are the name of the game. However, given the enormous volume of data generated by such equipment, geographic information systems (GIS) are required for the processing and modeling of large datasets. GIS leverage both traditional enterprise relational databases and spatial database engines to organize, visualize and analyze such bathymetric datasets.

Distributed GIS can access diverse datasets across the Internet using web services, allowing processing to occur at the client and/or the server. As millions of miles of multi-beam survey data in a variety of formats, platforms and knowledge domains have been collected in the last few decades, the challenge is either to migrate from local storage to publicly available Internet cloud or grid storage, or to categorize and link to local online stores with appropriate metadata labeled for automated remote processing.

The current lack of any organized, standards-based approach to sharing data and products must be addressed to unlock the distributed value of these vast resources of survey data. Both the data components and the functions, or methods, should be distributed across the Internet as published models in order to truly unlock the hidden reservoirs of information that are already available but today nearly impossible for the layman to access.

A truly standards-based, open-object architecture would allow the assembly of geo-processing components distributed across the Internet, improving the performance and efficiency of client-server communication and effectively opening the entire geo-processing framework to all people and organizations via the World Wide Web. Such an online framework would move humanity much closer to seeing and understanding planet Ocean, as well as accelerate and pinpoint the areas where further data collection is required.
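
As a concrete illustration of the standards-based access this section argues for, the sketch below requests a bathymetry raster over a bounding box from an OGC WMS-style web service. The endpoint, layer name, and output format are hypothetical placeholders, not an existing service:

```python
# Hypothetical sketch: fetching a bathymetry raster for a bounding box from
# a standards-based (OGC WMS-style) web service. The endpoint and layer are
# placeholders; a real deployment would publish its own capabilities.
import requests

ENDPOINT = "https://example.org/bathymetry/wms"  # hypothetical service URL

params = {
    "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
    "LAYERS": "seafloor_depth",            # hypothetical layer name
    "CRS": "EPSG:4326",
    "BBOX": "18.5,-158.5,22.5,-154.5",     # lat/lon box around Hawaii
    "WIDTH": "1024", "HEIGHT": "1024",
    "FORMAT": "image/geotiff",             # assumed supported format
}

response = requests.get(ENDPOINT, params=params, timeout=30)
response.raise_for_status()
with open("hawaii_bathymetry.tif", "wb") as out:
    out.write(response.content)
```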

Why Map the Ocean?

When you start to examine all the possible benefits to having a complete map of the Ocean floor, it seems extraordinary that we have not already completed this task.

But when you consider that the technology needed to map the seafloor efficiently, accurately and in an easily shared form has only recently become readily available, it is no surprise that only the world superpowers, namely the US and Russia, had the will and means to map the Ocean comprehensively, and then only for purposes of war, and thus were not willing to share their data publicly.

Today, the needed technology is commercially available, and the many industrial and scientific benefits to be gained by producing a full Ocean map are both abundant and critical. The overwhelming need for a map becomes plainly obvious when you review a brief overview of the benefits to humanity’s production of food, energy and minerals, as well as to progress in science, technology and maritime safety. Maps are key to understanding.

Lost Booty and Treasures of History

Only 3 years ago, in 2007, the secretive U.S. underwater treasure hunting company Odyssey Marine Exploration found a 17th-century shipwreck off the coast of Britain with over 17 tons of gold and silver coins, estimated to be worth nearly $500 million at the time of discovery.

As the wreck was located in international waters, the company stands to keep nearly 90% of its find. The same company has reportedly found the British warship HMS Sussex, which sank in the Straits of Gibraltar in 1694 carrying ten tons of gold bullion ($400 million in 2010 value). How many more treasure ships does the Ocean hold for us?

As thousands of tons of gold and silver were moved across the Ocean in the last 500 years, hundreds of tons were consumed by the Ocean and now lie on its floor in shipwrecks. We finally have the technology to both find and recover the riches of Davy Jones’ locker, which are estimated to be worth tens of billions of dollars.

Furthermore, with robotic underwater vehicles (RUVs), recovery can be done carefully, preserving the find for archaeological study. As the deep Ocean can be a great preserver, we can learn much about human history from such shipwrecks.

Goldwater to Diamonds in the Rough

The Ocean contains on average 40 pounds of gold in every cubic mile of seawater. The English chemist Edward Sonstadt was the first to definitively establish its presence, in 1872. Even at a conservative estimate of 10 ppt of gold in seawater, there is a great deal of gold in solution in the Ocean.

In all of human history, we have unearthed an estimated total of 3.3 billion ounces of gold, an amount equivalent to a cube of gold 55 feet on a side. The Ocean contains about 25 billion ounces of gold in its water alone, and scientists have been theorizing for over a century about how to extract it.

In 1925, the famous German chemist Fritz Haber, who had kept his country fighting in WWI by extracting nitrogen from the atmosphere for its bombs and bullets, was sent on a secret mission to extract gold from the sea to repay the Allies’ reparations demand of 50,000 tons of gold. While Haber’s mission was a complete failure, work has more recently progressed on biotech solutions using bacteria to mine gold from the water.

By the mid-20th century, the search for minerals had extended to the rocky seafloor and below the Earth’s crust. Initially funded by the US government via the National Science Foundation, Project Mohole was an ambitious and, in key respects, successful effort to drill through the seafloor toward the Earth’s mantle.

In 1961, five holes were drilled off the Pacific coast of Mexico, the deepest at 601 ft below the sea floor in 11,700 ft of water. This was unprecedented: not for the hole’s depth, but because of the depth of the water and because it was drilled from an untethered platform. Project Mohole thus virtually invented what is now known as dynamic positioning, and it also established that the Ocean crust is composed of basalt, ending decades of debate.

A couple of years later, DeBeers, the global diamond cartel, hired Willard Bascom, the director of Project Mohole, to design and operate a ship to mine diamonds from the seafloor. By the end of 1964, Bascom’s ship, the Rockeater, had recovered 129 diamonds from diamondiferous gravel on the Ocean floor. His survey team estimated that the concession held more than ten million carats, or two tons, of diamonds.

Black Pearls, not EEZ Money

First discovered by the HMS Challenger expedition in the late 19th century, the seafloor is covered in many spots by nodules, irregular balls of manganese and other metals. The potato-like lumps formed like pearls, from dissolved metals in seawater precipitated over millions of years. From numerous dredging expeditions in the 1950s and 60s, it was concluded that trillions of nodules lie on the seafloor. Furthermore, despite being ugly and slimy, the black nodules are treasures of rare metals: up to 25% manganese, 2% cobalt, 2% copper, 2% nickel and dozens of uncommon elements.

Many of the densest sites of nodules happen to lie off the shores of the US. The greatest concentrations with the highest metal contents were discovered just southeast of Hawaii, with thousands of square miles of seafloor densely covered with nodules like cobblestones. The unclaimed nodules are in theory worth trillions of dollars, assuming they can be collected efficiently without destroying the deep Ocean ecosystem.

By the 1970s, numerous global consortia were formed, preparing to mine the nodules. The United Nations reacted by drafting the first version of the Law of the Sea Treaty, ostensibly to protect the poor nations’ claims to seabed resources. Eventually, it was concluded that the nodules were a potential treasure, but only if metal prices increased and the technology advanced sufficiently to make the deep-water mining more efficient.

The eventual outcome of the whole “black pearl” frenzy was the UN Law of the Sea Treaty and its new rule allowing a nation to claim a large offshore area known as an Exclusive Economic Zone, or EEZ, extending out 200 nautical miles from its coast. It just so happens that this rule benefits the USA with the largest EEZ on the planet, effectively more than doubling the size of our domain by adding more than four million square miles of Ocean to our country. The EEZ around Hawaii is nearly one million square miles by itself.

Copper Chimneys and Volcanoes of Gold

In the late 1970s, the first hot chimneys, or hydrothermal vents, were discovered along the Galapagos Rift. By 1981, a team of NOAA researchers exploring the area had discovered twenty extinct chimneys on top of the largest known mass of seafloor massive sulfides (SMS).

The ore deposit was estimated to be 130 feet thick, 650 feet wide and 3280 feet long. A core sample from one of the chimneys was composed of sulfur, iron, copper, silica, zinc, manganese, aluminum, selenium, cobalt, magnesium, molybdenum, lead, arsenic, barium, cadmium, chromium, phosphorus, mercury, nickel, tin, vanadium, uranium, tungsten, and silver. The copper alone from the deposit was estimated to exceed $2 billion in value.

By the mid-1980s, more SMS deposits had been found, with concentrations of copper and silver higher than any described before. Finally, gold was discovered on the Juan de Fuca Ridge off Oregon, in the Mid-Atlantic Ridge’s volcanic center, and off the New Guinea archipelago. At forty-three parts per million, the highest concentrations of gold were four thousand times greater than the average in the Earth’s crust and ten times higher than what was routinely mined on land!

In the last decade, mineral exploration companies, driven by elevated prices in the base metals sector, have turned their attention to the extraction of mineral resources from hydrothermal fields on the seafloor. Two companies are currently in the late stages of preparing to mine seafloor massive sulfides: Nautilus Minerals is in the advanced stages of commencing extraction from its Solwara deposit in the Bismarck Archipelago, and Neptune Minerals is at an earlier stage with its Rumble II West deposit.

Both companies propose using modified existing technology. Nautilus Minerals, in partnership with Placer Dome (now part of Barrick Gold), succeeded in 2006 in returning over 10 tons of mined SMS to the surface using modified drum cutters mounted on an ROV - a world first. Neptune Minerals in 2007 succeeded in recovering SMS sediment samples using a modified oil industry suction pump mounted on an ROV - also a first.

Treasures of Deep Life

One of the most exciting, and surprising, treasures discovered in the deep Ocean has been heat-loving microbes. The mining of life from the volcanic deeps has been critical to advances in biotechnology over the last few decades. By weight, these single-cell organisms are worth far more than gold.

In the volcanic domain of the hydrothermal vents, where massive pressures allow no boiling, water becomes superheated. Without the violence of boiling, many exotic microorganisms have adapted to survive the great heat. In fact, hydrothermal vent zones have a density of organisms up to 100,000 times greater than the surrounding seafloor.

Hydrothermal vent communities are able to sustain such vast amounts of life because vent organisms depend on chemosynthetic bacteria for food. The water that comes out of the hydrothermal vent is rich in dissolved minerals and supports a large population of chemo-autotrophic bacteria. These bacteria use sulfur compounds, particularly hydrogen sulfide, a chemical highly toxic to most known organisms, to produce organic material through the process of chemosynthesis.
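
A commonly cited overall reaction for this sulfide-driven chemosynthesis (a textbook summary, not taken from the report) is CO2 + 4 H2S + O2 → CH2O + 4 S + 3 H2O, where CH2O stands for the carbohydrate building blocks of organic matter: the energy that sunlight supplies in photosynthesis is supplied here by the oxidation of hydrogen sulfide.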

This ecosystem is reliant upon the continued existence of the hydrothermal vent field as the primary source of energy, which differs from most surface life on Earth, which is based on solar energy. Microbes able to flourish and thrive at these very high temperatures, which break all the previously known rules of life, turned out to be a treasure trove for genetic engineering.

The main allure of such microbes is their enzymes, which, like the creatures themselves, work at extraordinarily high temperatures and thus can survive the heat of industrial biochemical processes. In 2008, the global enzyme market was estimated at $4.2 billion, with a significant annual growth rate of 7-10%.

Additionally, marine microbes have a huge impact on the planet's climate and ecosystems by trapping carbon from the atmosphere. New research has shown marine microbes are far more abundant than previously thought, making up 50 to 90 percent of all the biological material, or "biomass", in the Ocean. Their combined weight matches that of 240 billion African elephants!

The true extent of the marine microbe population is only now being revealed by the Census of Marine Life, a global project to survey life in the Ocean involving over 2,000 scientists in 80 nations. In the 1950s, it was estimated that about 100,000 microbial cells inhabited a liter of seawater. Today, the same volume is believed to contain more than a billion microorganisms. A gram of seafloor mud holds about the same number.

New uses for marine microbes are found every day. In just a couple of weeks in May 2010, two companies found new uses for marine extracts. First, Aquapharm, a Scottish marine biotechnology company, found a natural cure for dandruff in microbes from the sea off Scotland’s coast and is poised to strike a major deal with a European company that could add around £1m a year to the fledgling firm’s revenues. Aquapharm, which is based near Oban, is developing a library of potential new drugs, treatments and antibiotics from microbes found in marine environments.

Second, in a breakthrough that offers new hope for the containment of influenza outbreaks, an Australian biotechnology company has isolated a natural extract from seaweed which has been shown to inhibit the H1N1 (swine flu) virus. The extract is derived from the Undaria pinnatifida species of seaweed. Developed by biotechnology company Marinova Pty Ltd, the extract has immediate market potential in nutritional supplements, hand washes and nasal delivery products targeting the spread and prevention of viral conditions. Scope also exists for the compound to be included in pharmaceutical and medical device applications.

The advances to come in the 21st century from marine microbes are truly unimaginable. Whether it is a cure for cancer, AIDS or future catastrophic plagues, or solutions for sustainable food supplies for our growing human population, the Ocean and its microorganisms have a seemingly limitless bounty to offer us.

Not Bottomless Oceanfood

Seafood is perhaps the largest international commodity, with fish trade exceeding US$60 billion per year. Almost 200 countries supply fish and seafood products to the global marketplace, comprising more than 800 commercially important species of fish, crustaceans and mollusks.

Through the current day, the supply of fish has kept up not only with a rapidly increasing population but also with increases in per capita consumption. However, global capture fisheries are at their maximum sustainable yield, and while aquaculture continues to grow, it will have difficulty keeping pace with global demand.

Between 1960 and 2003, the world’s population rose from 3 billion to 6.3 billion representing an increase of 110% and an annual rate of growth of 1.7 percent. This rate of increase is unprecedented and has posed major challenges to food producers. By 2025, the global population is expected to grow to 8.5 billion, a further increase of 35 percent.

In 2003, fish accounted for approximately 16 percent of the animal protein consumed worldwide and in some Asian countries the proportion ranges as high as 30 percent to 50 percent. For about one billion people, seafood is the primary source of animal protein.

Total fish catch increased from approximately 38 million tons in 1960 to 137 million tons in 2003, an increase of 260 percent and an annual rate of growth of 3.0 percent. During this period, fish also emerged as one of the largest export commodities in the world, with 2003 exports estimated at US$63.5 billion. This dwarfs global trade in commodities such as coffee, cocoa, rubber, sugar, tea, tobacco or rice.
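
The growth rates quoted above follow from the standard compound-growth formula, rate = (end/start)^(1/years) - 1. A quick check of the report's figures for the 43 years from 1960 to 2003:

```python
# Compound annual growth rate: rate = (end / start) ** (1 / years) - 1.
# Verifying the 1960-2003 (43-year) figures quoted above.

def cagr(start: float, end: float, years: float) -> float:
    return (end / start) ** (1.0 / years) - 1.0

print(f"{cagr(3.0, 6.3, 43):.1%}")     # world population: 1.7%
print(f"{cagr(38.0, 137.0, 43):.1%}")  # total fish catch: 3.0%
```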

A major international scientific study released in November 2006 in the journal Science found that about one-third of all fishing stocks worldwide have collapsed (with a collapse being defined as a decline to less than 10% of their maximum observed abundance), and that if current trends continue all fish stocks worldwide will collapse within fifty years.

In July 2009, Boris Worm of Dalhousie University, the author of the November 2006 study in Science, co-authored an update on the state of the world's fisheries with one of the original study's critics, Ray Hilborn of the University of Washington at Seattle. The new study found that through good fisheries management techniques even depleted fish stocks could be revived and made commercially viable again.

Recent research on rockfish shows that large, elderly females are far more important than younger fish in maintaining productive fisheries. The larvae produced by these older maternal fish grow faster, survive starvation better, and are much more likely to survive than the offspring of younger fish. Failure to account for the role of older fish in maintaining healthy populations may help explain recent collapses of some major US West Coast fisheries. Recovery of some stocks is expected to take decades. One way to prevent such collapses may be to establish marine reserves, where fishing is not allowed and fish populations age naturally.

Fisheries management decisions are often based on population models; however, those models need quality data to be effective. It is that caliber and volume of data that is lacking in fisheries science, according to Milo Adkison, an associate professor in the School of Fisheries and Ocean Sciences at the University of Alaska Fairbanks.

"Many fisheries scientists spend a lot of time and effort doing complicated analyses using complex models of their data," said Adkison. "This effort might be better spent collecting more and better data. The primary limitation in fisheries management decisions is the absence of quality data. Scientists and fishery managers would be better served with simpler modeling analyses and improved data. "

Better data starts with an accurate map of the ecosystem: first a high-resolution seafloor map, followed by backscatter and water column data. Such data is used in GIS-linked systems like Ecopath, an ecosystem modeling software suite widely used in fisheries management as a tool for visualizing and simulating the complex relationships that exist in real-world marine ecosystems. The software, initially a NOAA initiative, was named in 2007 as one of the ten biggest scientific breakthroughs in NOAA’s 200-year history. Now if we could just get some quality data loaded into the system.
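
Adkison's point about simpler models can be made concrete with the classic Schaefer surplus-production model, one of the simplest tools in stock assessment: next year's biomass is this year's biomass plus logistic growth minus the catch. A minimal sketch with hypothetical parameters (a generic textbook model, not Ecopath's internals):

```python
# Schaefer surplus-production model, a minimal fisheries stock model:
# B[t+1] = B[t] + r * B[t] * (1 - B[t] / K) - catch[t].
# Growth rate r, carrying capacity K, and the catch series are hypothetical.

def project_biomass(b0: float, r: float, k: float, catches: list[float]) -> list[float]:
    biomass = [b0]
    for catch in catches:
        b = biomass[-1]
        biomass.append(max(b + r * b * (1.0 - b / k) - catch, 0.0))
    return biomass

# A stock at half of carrying capacity, fished below its maximum
# sustainable surplus (r * K / 4 = 100 units per year):
print(project_biomass(b0=500.0, r=0.4, k=1000.0, catches=[50.0] * 5))
```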

Additionally, a complete Ocean map would increase the efficiency of the seafood industry as a whole. Helping fishermen catch fish more quickly by giving them a better understanding of their prey’s habits benefits everyone, even the fish, if quotas and marine reserves are set more accurately based on quality information.

Limitless Clean Ocean Power

For anyone who has ever spent time near or on the Ocean, the one universal marvel and wonder is the immense power wielded by Mother Ocean. Her waves, tidal currents, winds and storms are eternally powerful. Yet despite all the Ocean’s power, man has only recently begun exploring how to harness and convert it into energy.

Water energy already provides nearly a fifth of the world's power, mostly in the form of hydropower from dams, but so far ocean-power generation remains mostly undeveloped around the globe. While profits remain elusive for now, the untapped force of the Ocean is creating a swell of support from green-energy proponents from both the public and private sector.

The rewards could be as vast as the Ocean. By one estimate put out by the World Energy Council, tapping power from tides and waves could theoretically double the world's current output of electricity, offering a nearly boundless supply of emission-free energy. The wave industry alone could reach 1.2 gigawatts by 2020, enough to power about 750,000 homes, and could reach 5 gigawatts by 2025, according to estimates from IHS Emerging Energy Research.

Ocean Power Technologies (OPT) is a publicly traded renewable energy company specializing in cost-effective and environmentally sound offshore wave power.

In December 2009, Ocean Power Technologies, Inc. deployed one of its PowerBuoys at the US Marine Corps Base Hawaii (MCBH) at Kaneohe Bay. The Oahu PowerBuoy was launched under the Company's ongoing program with the US Navy. The system extracts the natural energy in Ocean waves, and is based on the integration of patented technologies in hydrodynamics, electronics, energy conversion and computer control systems. The PowerBuoy is a “smart” system capable of responding to differing wave conditions.

OpenHydro is a technology business that designs and manufactures marine turbines to generate renewable energy from tidal streams. The company's vision is to deploy farms of tidal turbines under the world's oceans - silently and invisibly generating electricity at no cost to the environment.

The Open-Centre Turbine is designed to be deployed directly on the seabed.

Aquamarine Power, a wave energy company, is developing an innovative hydroelectric wave energy converter known as Oyster.

The first demonstration-scale Oyster 1 was successfully deployed at sea at the European Marine Energy Centre (EMEC) in Orkney, Scotland in November 2009, when it began producing power for the National Grid. The new Oyster 2 is an 800kW device measuring 26 meters by 16 meters that will deliver 250 percent more power than the original Oyster 1.

SDE, an Israeli firm, is currently finalizing the construction of the first of many wave power plants to be installed in China. The first 1MW wave power plant will be installed in the province of Guangzhou, in the city of Dong Ping, and will represent the successful beginning of the implementation of an overall 10,000MW plan to be executed in Guangzhou in the near future.

China has a great need for the swift implementation of SDE's wave power technology due to its high rates of pollution. Apparently, only one percent of the country's 560 million city dwellers breathe air considered safe by the European Union. According to the Ministry of Health, pollution in China has made cancer the country's leading cause of death. Ambient air pollution alone is blamed for hundreds of thousands of deaths each year, and nearly 500 million people lack access to safe drinking water.

The construction cost of a 1 MW SDE power station starts from US$650,000, while a comparable coal station costs US$1,500,000, a natural gas station US$900,000, a solar station US$3,000,000, and a wind station US$1,500,000. In addition, SDE's claimed production cost is only US$0.02 per kWh, compared to US$0.03 from coal, US$0.035 from natural gas, US$0.12 from solar energy, and US$0.036 from wind. Supposedly, the system that SDE has developed is scalable to supply 500 times the electricity requirements of the entire world population! We'll see.

Marine Current Turbines Ltd is a leader and first mover in the development of new technology for exploiting tidal currents for large-scale power generation. Its full-scale commercial system, known as SeaGen, was installed in Strangford Lough, Northern Ireland, in May 2008, and is the world's leading deployed marine current and tidal stream technology.

SeaGen is by far the largest and most powerful tidal turbine in the world, with twin rotors each sweeping over 200 square meters of flow. It also uses the most efficient type of turbine rotors, namely axial-flow, pitch-controlled rotors, the technology of choice in the wind industry, generating a rated power output of 1.2 MW at a current velocity of 2.4 m/s.
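As a sanity check on those numbers, the power available in a moving current scales with the cube of its velocity: P = ½ρAv³. A minimal sketch in Python (the seawater density and the round 400 m² total swept area are our assumed values):

    # Back-of-the-envelope check of SeaGen's rated output.
    # Illustrative values: typical seawater density, ~400 m^2 total swept area.
    rho = 1025.0        # seawater density, kg/m^3
    area = 2 * 200.0    # twin rotors, each sweeping roughly 200 m^2
    v = 2.4             # rated current velocity, m/s

    kinetic_flux = 0.5 * rho * area * v**3   # power carried by the flow, watts
    cp = 1.2e6 / kinetic_flux                # power coefficient implied by 1.2 MW rating

    print(f"Flow power: {kinetic_flux / 1e6:.2f} MW, implied Cp: {cp:.2f}")
    # -> Flow power: 2.83 MW, implied Cp: 0.42

An implied power coefficient of about 0.42 sits comfortably below the theoretical Betz limit of 0.59, consistent with the rotors being genuinely wind-industry-class technology.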

Pelamis Wave Power Ltd is the manufacturer of a unique system to generate renewable electricity from Ocean waves. The Pelamis Wave Energy Converter is a semi-submerged, articulated structure composed of cylindrical sections linked by hinged joints. The wave-induced motion of these joints is resisted by hydraulic rams, which pump high-pressure fluid through hydraulic motors via smoothing accumulators. The hydraulic motors drive electrical generators to produce electricity.

Power from all the joints is fed down a single umbilical cable to a junction on the seafloor. Several devices can be connected together and linked to shore through a single seabed cable. Current production machines are 180m long and 4m in diameter with 4 power conversion modules per machine. Each machine is rated at 750kW.

Alpha ventus is the first open sea wind farm of its kind in Germany's territorial waters. It marks the beginning of a new era in environmentally friendly power generation in Germany, far from the coast in the open sea, in deep-sea conditions and utilizing cutting-edge technology. With twelve wind turbines, this new wind farm has a total capacity of 60 megawatts. At full operation, alpha ventus will produce enough energy annually to meet the consumption requirements of 50,000 households.

With US government approval finally secured, Cape Wind is now building America's first offshore wind farm on Horseshoe Shoal in Nantucket Sound. Miles from the nearest shore, 130 wind turbines will gracefully harness the wind to produce up to 420 megawatts of clean, renewable energy. In average winds, Cape Wind will provide three quarters of the Cape and Islands' electricity needs. Construction is scheduled to be completed in 2013.

In April 2010, the U.S. Department of Energy gave Lockheed Martin two grants totaling $1 million to begin looking at ways to generate electricity from the temperature difference between chilly water drawn from 2,000 feet below the Ocean surface and warm surface water. Ocean Thermal Energy Conversion, or OTEC, could in theory deliver 3-5 terawatts (TW); five TW would be about 30% of global energy consumption.

Lockheed Martin is looking to install a pilot plant in the range of 5-10 MW to demonstrate the integration of the complete system and to help understand the economics, the risks and the total performance. But first, Lockheed Martin will invest the initial grant funds to develop a tool to estimate how much energy can be extracted from the ocean's thermal layers.

A GIS-based dataset and software tool will be developed to allow users to estimate the potential power from a region of the ocean. The resulting resource mapping will provide critical information to policymakers, the energy industry and the public about regional OTEC feasibility.

In cooperation with two universities, Florida Atlantic University and the University of Hawaii, Lockheed Martin will develop a GIS tool that can identify the most promising areas of the Ocean and return to users an estimate of the megawatts that could be produced there. This information would let decision-makers evaluate the future of OTEC in a given location. Estimating power generation from local Ocean regions would help people grasp the size of the resource and its value to the local community. Mapping the sea would also help to quantify the resource beyond what we know today.
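To make concrete what such a tool might compute, consider the physics that bounds any OTEC estimate: the Carnot limit for a 20 K difference on a roughly 298 K surface is only about 7%, and practical plants net perhaps 2-3% after pumping losses. The sketch below is purely our illustration of that kind of per-site estimate, not Lockheed Martin's actual method, and the 2.5% net efficiency is our assumption:

    # Hypothetical per-site OTEC estimate from surface and deep-water temperatures.
    # The 2.5% net efficiency is an assumed figure, not a published plant value.
    def otec_net_power_mw(t_surface_c, t_deep_c, warm_flow_m3_s, net_efficiency=0.025):
        rho, cp = 1025.0, 4000.0                         # seawater density, specific heat
        delta_t = t_surface_c - t_deep_c                 # exploitable temperature difference, K
        thermal_w = rho * cp * warm_flow_m3_s * delta_t  # heat carried by the warm stream
        return net_efficiency * thermal_w / 1e6

    # A 25 C surface over 5 C deep water, with ~5 m^3/s of warm flow:
    print(otec_net_power_mw(25.0, 5.0, 5.0))   # -> ~10.25 MW, pilot-plant scale

A GIS tool would run this kind of calculation over gridded temperature data, which is exactly why complete, accurate Ocean maps and profiles matter for siting.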

All Ocean-power devices are either anchored to the seafloor, suspended above it, or have cables that span large stretches of Ocean floor for the transmission of energy. The use of bathymetry and GIS to clearly depict the seafloor, determine accurate depths, and collect substrate composition data will lead to more capable decision support for engineering and design work.

Accurate information and maps are essential for locating infrastructure, transmission cables, and maintenance facilities. Ocean power companies are responsible for planning, building, and operating their technologies in ways that do not harm marine ecosystems. Clear, detailed, and precise representations of the seafloor are required for Ocean power and other related industries to develop and thrive.

Big Oil & Bigger Oil

The six largest, non state-owned energy companies, commonly referred to as either the supermajors or Big Oil, had combined revenues of about $1.6 Trillion USD in 2008.

The most interesting point about this huge revenue number and consolidation of wealth is that the supermajors control only about 5% of global oil and gas reserves. Conversely, 95% of global oil and gas reserves are controlled by state-owned oil companies, primarily located in the Middle East.

The world’s biggest company is controlled by Saudi Arabia and is not listed on a stock exchange, according to new research by the Financial Times and the management consultancy McKinsey. Saudi Aramco, the state company in charge of the country’s vast oil reserves, is worth an estimated $781 Billion, dwarfing the $454 Billion market capitalization of its rival ExxonMobil.

Saudi Aramco owns both the largest conventional and offshore oil fields on the planet. Ghawar is by far the largest conventional oil field in the world, measuring 170 by 19 miles. Currently, it is producing 5 million barrels per day. It supposedly has more than 71 billion barrels of oil and 110,000 billion cubic feet of natural gas in proven reserves remaining.

Safaniya, the largest offshore oil field in the world, is also owned and operated by Saudi Aramco. It is located in the Persian Gulf, 165 miles north of the company headquarters in Dhahran, Saudi Arabia. It measures 31 by 9.3 miles, with a production capability of more than 1.2 million barrels per day. Its reserves amount to around 37 billion barrels of oil and 5,360 billion cubic feet of natural gas.

To put Saudi Aramco’s Safaniya offshore field in perspective: if all areas offshore of the contiguous US were opened up, they might yield some 18 billion barrels of crude and 77,000 billion cubic feet of gas spread over decades, which is only equivalent to three years of US consumption.

So, YES, our country is, and will always be, foreign-oil dependent, as long as we are dependent on oil. It is quite ironic that we are allowing foreign oil companies, like BP, to experiment with deep drilling in our precious Gulf of Mexico, when the Saudis have five times more oil in production in just two fields than we have in all our untapped, previously unspoiled, offshore Ocean resources.

It is even more ironic that it was our technological prowess, via Standard Oil of California's deal with the Saudis back in the 1930s, that allowed the discovery and exploitation of their massive reserves. One would think that we would now leverage our technology innovation and American ingenuity to harness Ocean power, which we have in great abundance on both our coasts, to free ourselves from the dirty, global-warming addiction to foreign fuel. However, most of the Ocean power startups are UK-based.

Finally, the largest customers of bathymetric survey services are the supermajors, who, of course, just like the superpowers in the Cold War, are not sharing any of their Ocean maps publicly, other than with the US Department of the Interior's Minerals Management Service when they apply to do offshore drilling. While most of the survey companies are private regional players, Fugro, a large Dutch public company, does about $1 billion annually of bathymetric survey work, more than 80% of which is for the oil and gas industry.

As a result, Big Oil has the most recent and highest resolution maps of the Ocean’s offshore coastal areas, and has basically been funding the bathymetric industry’s progress for the last few decades, especially since the end of the Cold War. Therefore, any major effort to Map the Ocean should carefully consider the strategic opportunities with relation to Big Oil; the supermajors have the most experience and largest investment in bathymetry to date as well as the most to gain financially in the short-term.

Early Transoceanic Communication

Not long after the invention of the telegraph in 1839, the prospect of laying cable on the seafloor began to occupy the minds of 19th-century innovators. In 1842, Samuel Morse laid a cable across New York Harbor and successfully telegraphed his Morse code through it.

By 1858, Cyrus West Field had convinced British industry leaders, based on Matthew Fontaine Maury's preliminary bathymetric findings of a plateau between Ireland and Newfoundland, to fund the first transatlantic cable to North America. While the cable only worked for a month, as the "plateau" actually crossed the Mid-Atlantic Ridge with its volcanic center, it was enough of a proof of concept to get the submarine cable industry started.

Subsequent attempts in 1865 and 1866 with the world's largest steamship, the SS Great Eastern, used more advanced technology and produced the first successful transatlantic cable. The Great Eastern later went on to lay the first cable reaching India from Aden, Yemen, in 1870. For the next 50 years, the British dominated the new industry, connecting their vast colonial empire with the latest communication technology.

By the beginning of the 20th century, the Pacific had cables linking the US mainland to Hawaii, Guam, the Philippines and Australia.

Seas of Glass

The introduction of fiber optic cables in the 1980s was a big advance in both reliability and bandwidth for the submarine cable industry. The first transatlantic telephone cable to use optical fiber was TAT-8, which went into operation in 1988. A fiber-optic cable comprises multiple pairs of fibers, with one fiber in each direction per pair. TAT-8 had two operational pairs and one backup pair.

Today submarine fiber optic cables carry 99 percent of international traffic, while the remainder is carried by satellite. The reliability of submarine cables is high, especially when multiple paths are available in the event of a cable break. Also, the total carrying capacity of submarine cables is in the terabits per second, while satellites typically offer only megabits per second and display higher latency. However, a typical multi-terabit, transoceanic submarine cable system costs several hundred million dollars to construct.

As a result of the high cost and critical connections provided, these cables are highly valued not only by the firms building and operating them for profit, but also by national governments. For instance, the Australian government considers its submarine cable systems to be “vital to the national economy.” Accordingly, the Australian Communications and Media Authority (ACMA) has created protection zones that restrict activities that could potentially damage cables linking Australia to the rest of the world. The ACMA also regulates all projects to install new submarine cables.

Almost all fiber optic cables until 1997 were constructed by "consortia" of operators. For example, TAT-8 counted 35 participants, including most major international carriers at the time, such as AT&T. Two privately financed, non-consortium cables were constructed in the late 1990s, preceding a massive, speculative rush to construct privately financed cables that peaked at more than $22 billion worth of investment between 1999 and 2001.

Submarine cables are most often broken by fishing trawlers, anchors, earthquakes, undersea avalanches, and sometimes even shark bites. A survey of breaks in the Atlantic Ocean and the Caribbean Sea found that between 1959 and 1996, less than 9% were due to natural events. Thus, human activity is by far the greater source of submarine cable faults. In response to this self-imposed threat to our communications network, the practice of cable burial has developed.

The average incidence of cable faults was 3.7 per 1,000 km per year from 1959 to 1979. That rate was reduced to 0.44 faults per 1,000 km per year after 1985, thanks to the widespread burial of cable starting in 1980. Still, cable breaks are by no means a thing of the past, with more than 50 repairs a year in the Atlantic alone, and significant breaks in 2006, 2008 and 2009.
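Those two rates make the payoff from burial easy to quantify. A quick sketch (the 6,500 km route length is our illustrative round number for a transatlantic run):

    # Expected annual fault count for a cable route, using the historical rates above.
    def expected_faults_per_year(route_km, faults_per_1000km_yr):
        return route_km / 1000.0 * faults_per_1000km_yr

    route_km = 6500.0                                # illustrative transatlantic route
    print(expected_faults_per_year(route_km, 3.7))   # pre-burial era: ~24 faults per year
    print(expected_faults_per_year(route_km, 0.44))  # post-burial era: ~2.9 faults per year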

Incidentally, cable breaks are reported by shore stations, which can pinpoint the location of a break quite accurately by means of electrical measurements. A repair ship is then sent to this location and drops a marker buoy near the likely position of the break.
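One classical form of such an electrical measurement is simple resistance ranging: if the conductor's resistance per kilometer is known, the resistance measured from the shore station to a fault that shorts the conductor to seawater gives the distance directly. A minimal sketch under idealized assumptions (uniform conductor, clean short; the figures are invented for illustration):

    # Idealized resistance-based fault ranging from a shore station.
    def fault_distance_km(measured_ohms, ohms_per_km):
        return measured_ohms / ohms_per_km

    # e.g. a conductor of 0.8 ohm/km measuring 2,920 ohms out to the fault:
    print(fault_distance_km(2920.0, 0.8))   # -> 3650.0 km from the shore station

Modern systems refine this with techniques such as time-domain reflectometry, but the principle of locating a break electrically from shore is the same.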

Optimum Cable Routing

Before any work is started on a transoceanic cable route project, a great deal of time and study is undertaken, including a thorough bathymetric survey that provides important data such as depth and the characteristics of the seafloor where the cable would be laid, especially as the route approaches the continental shelf.

In shallow water these fiber optic cables can be buried in a trench up to 1 meter deep. As depths increase, however, they are usually left uncovered, and at extreme depths they are simply allowed to sink to the ocean floor. Thus, using sonar to create an accurate route map becomes a necessity, both for the firms operating the transoceanic fiber optic cables and for other industries, such as fisheries, that may be barred from operating near the cable route.

The use of multi-beam sonar is generally the most efficient approach for mapping a transoceanic fiber optic cable route, because both the topography and bathymetry can be surveyed and recorded along a projected route. In some areas, the unique backscatter data produced by a side scan sonar is required to plan the best route for a cable, in which case multi-beam and side scan sonar systems are leveraged together. Additional tools like sub-bottom profilers are used to confirm sediment depths, and magnetometers are sometimes used to determine or confirm the exact location of existing pipelines and cables.
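Underneath all of these instruments sits the same basic measurement: timing an acoustic pulse to the seafloor and back. A minimal sketch (1,500 m/s is a nominal sound speed; real surveys correct for the local sound-velocity profile):

    # Depth from two-way acoustic travel time, the core sonar calculation.
    def depth_m(two_way_travel_s, sound_speed_m_s=1500.0):
        return sound_speed_m_s * two_way_travel_s / 2.0

    print(depth_m(5.33))   # a 5.33-second echo implies roughly 4,000 m of water

Multi-beam systems simply make thousands of such measurements per ping across a fan of angles, which is what turns a single track line into a swath of mapped seafloor.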

Next to oil and gas, the submarine cable industry is the second largest consumer of bathymetric survey services. Having an accurate seafloor map and understanding of the surface composition is essential to planning the safest and most efficient route for each multi-million dollar cable project. After the project is complete, effective sharing of the cable route map with the proper authorities is equally important in order to lower the future risk of human activity catching or breaking the cable.

If an entire high-resolution map of the Ocean were available, it would certainly be of great value to the submarine cable industry, the oldest consumer of bathymetric survey services. Cable routes never considered before could potentially be found to be more cost effective than previously imagined. Also, as cable routes to developing nations in Africa and South America are being considered, finding more efficient routes to these less affluent nations could accelerate such projects.

Future Offshore Developments

In the last few years, technology companies like Google have filed patents to construct offshore datacenters, which would leverage the Ocean’s unclaimed real estate and cold waters to lower operating costs and carbon emissions from their dense server farms. A full Ocean map would help in locating new cable routes that could intersect with ideal potential sites for these future offshore datacenters.

Such offshore datacenters and new cable routes could also potentially be combined with the new concept of creating permanent dwellings at sea, called seasteads. The Seasteading Institute was established to explore setting up permanent, autonomous ocean communities to enable experimentation and innovation with diverse social, political, and legal systems. Again, an Ocean map would be instrumental in finding the ideal locations for such new communities to develop.

What Rock?

One of the most obvious but critical uses of an Ocean map is safe navigation. For efficient trade in and out of ports, nautical charts must be accurate to help ships navigate safely. As trillions of dollars worth of goods are transported across the Ocean annually, safe passage is essential to successful commerce. Despite the electronic charts installed in most commercial vessels today, the Ocean is in a continual state of flux, and obstructions like sand bars are changing constantly. Furthermore, many chart depths were recorded incorrectly decades or even centuries ago with old technology, and have not been updated since.

Even the US Navy, which has been mapping the Ocean floor for 160 years, makes mistakes. In fact, three US nuclear submarines have collided with seamounts, pinnacle rocks, and even the seafloor itself since 2005, causing hundreds of millions of dollars in damage, killing and injuring crewmembers, and endangering nearby civilians. Among the reasons cited for these accidents are human error and a lack of vital information, namely a complete Ocean floor map. If our submarines' computer navigational systems had complete and accurate bathymetry, many such collisions might be avoided.

So if the most advanced navy in the world, with the most complete Ocean map ever collected, is still running into rocks, underwater mountains and the seafloor itself, one can just imagine the probable billions of dollars in damage and loss of life annually from commercial and recreational vessels that don't have the equipment or maps our Navy has. The bottom line is that a complete Ocean map would save billions of dollars in ship damage and lost goods, plus hundreds of lives, each year. Additionally, the secondary effect would be lower insurance rates due to the increased safety and lower risk of operating a boat or ship on the Ocean, savings which would surely amount to billions over time.

Big Waves & Perfect Storms

While the power of the Ocean is always awe-inspiring and magnificent, it can quickly turn terrifying and catastrophic. Storm surges from hurricanes or cyclones, and even more surprising, tsunamis generated from undersea earthquakes, can decimate entire communities, killing millions of people and wreaking billions of dollars of destruction on buildings and infrastructure. While man cannot stop such natural disasters, we can better plan for them and develop more timely and accurate warning systems. A complete and accurate Ocean map allows the creation of high resolution, integrated bathymetric-topographic digital elevation models (DEMs), computer representations of Earth's surface, that are suitable for tsunami and storm surge modeling, forecasting, and warning, which will save both lives and money. Specifically, tsunami travel times across the Ocean basins are directly related to seafloor depth, so a complete high resolution Ocean map is extremely useful for effective tsunami modeling and forecasting. Since you never know where the earthquake or disturbance is going to strike, partial maps introduce large unknowns into the DEM.
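The depth dependence is simple physics: in the open ocean a tsunami travels as a shallow-water wave whose speed is the square root of gravity times water depth, v = sqrt(g·h). A quick illustration (the depths and the 4,000 km crossing are our round numbers):

    # Tsunami speed and basin-crossing time from the shallow-water relation v = sqrt(g*h).
    from math import sqrt

    def tsunami_speed_kmh(depth_m):
        g = 9.81                          # gravitational acceleration, m/s^2
        return sqrt(g * depth_m) * 3.6    # convert m/s to km/h

    print(tsunami_speed_kmh(4000.0))              # -> ~713 km/h in a 4,000 m deep basin
    print(tsunami_speed_kmh(50.0))                # -> ~80 km/h over a 50 m shelf
    print(4000.0 / tsunami_speed_kmh(4000.0))     # -> ~5.6 hours to cross 4,000 km

This is why errors in charted depth translate directly into errors in forecast arrival times, and why shallow coastal bathymetry, where the wave slows and piles up, matters most of all.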

Long term coastal planning based on an accurate DEM can save thousands or even millions of lives, depending on the scale of the wave or surge and the location it hits, not to mention billions of dollars. An accurate DEM and a tsunami warning system would have saved a large portion of the nearly 230,000 people who were killed in the tsunami of December 26th, 2004, generated by the earthquake off Sumatra. Despite a lag of up to several hours between the earthquake and the impact of the tsunami, nearly all of the victims were taken completely by surprise!

Hurricane Katrina in 2005 was the largest natural disaster in the history of the United States. Total damage was $81 billion, with 1,836 deaths, most of them a direct result of the storm surge. Much of the death and destruction could have been avoided with better planning based on accurate DEMs for the storm surge. The levees should have been designed for such a surge, as hurricanes are an annual occurrence in the Gulf of Mexico. A high resolution seafloor map of the Gulf might have allowed city planners to predict the danger more accurately and fix the levees years before Katrina, potentially saving hundreds of lives and tens of billions of dollars.

Get it now! FREE Dumping for Toxic and Radioactive Waste

That's right, folks! For just the cost of a boat and gas, you too can dump unlimited toxic and radioactive waste directly into Mother Ocean! And you don't have to travel thousands or even hundreds of miles, wasting your precious fuel or time; we have a wonderful dump only thirty miles west of San Francisco, right amid some of the most productive fisheries on the West Coast. So you can catch a few salmon or halibut on your way home, providing a great cover story for your liberal, save-the-whales type friends at the fresh fish BBQ that evening…

Yes, believe it or not, starting in 1946 our wonderful United States of America began dumping thousands of barrels of radioactive waste next to the Farallon Islands, an area rich in wildlife similar to Monterey Bay, home to thousands of sea lions, seals and birds, with the surrounding waters teeming with whales, porpoises, snapper, sole, salmon, striped bass, halibut, rockfish, sablefish, and herring, as well as bottom dwellers such as sea urchins, octopi, and abalone.

Until 1970, the Ocean surrounding the Farallones was used as a nuclear dumping site for radioactive waste under the authority of the Atomic Energy Commission, at a site known as the Farallon Island Nuclear Waste Dump. Most of the dumping took place before 1960, and all dumping of radioactive wastes by the United States was terminated in 1970. By then, 47,500 fifty-five-gallon steel drums had been dumped in the vicinity, with a total estimated radioactivity of 14,500 curies.

The wastes dumped at the Farallones included thorium, uranium, plutonium, cesium, strontium, and all laboratory materials that were contaminated with radioactivity. Plutonium-239, the main fuel for nuclear weapons, has a half-life of roughly 24,100 years and is so deadly that a tiny speck not visible to the eye can cause cancer.

These steel drums were not modern waste disposal models, designed to last hundreds of years and withstand extreme pressure, but rather good old-fashioned oil drums. And in the rush of the dumping process, as everyone was motivated to get to the fishing part of the trip, the drums often were not filled completely, so they floated when dumped. The US Navy would then take immediate and forceful action to prevent them from floating off, simply shooting them full of holes to ensure that they sank quickly.

While the Farallon site was the first and largest US undersea dump site, it was not the last. The hottest spot was 150 miles off the coast of Delaware, where 14,300 barrels of radioactive waste were dumped, along with a nuclear reactor vessel from the submarine Seawolf. America, the leader of the free world and the Atomic Age, established 50 such sites throughout the Atlantic and Pacific. Our leadership in dumping provided a role model for the rest of the nuclear community, starting a global trend of wantonly dumping the most dangerous man-made wastes in the Ocean that lasted until the 1990s.

While the US stopped such dumping practices in the early 1970s, it was not until 1983 that the nuclear community put a final moratorium on radioactive dumping into the Ocean. In the early 1990s, a proposal strongly backed by Greenpeace was finally taken up by the UN to account for all the dumping. The UN's International Atomic Energy Agency published a report in March 1991 revealing that the 73 reported Ocean dumps contained nearly 1.24 million curies of radiation.

While the USSR was a signatory to the moratorium and party to all these talks, it maintained that it had never engaged in Ocean dumping of radioactive material. With Siberia and all that empty land, everyone must have figured that they just buried it on land. However, when the USSR collapsed, this façade of innocence quickly fell too.

In October 1992, Russian president Boris Yeltsin ordered an investigation into Soviet dumping practices. Six months later, it was revealed that the USSR had broken all international and national laws "consciously and frequently" in regard to dumping of radioactive waste. Not only had they dumped barrels and reactor shells, but they were engaged in an ongoing practice of dumping liquid used for cooling nuclear reactors directly into the Ocean. The volume of such discharges was more than 80 million gallons, enough to fill a lake, and the dumping was still happening, as no processing plants existed.

Furthermore, a total of 18 nuclear reactors from submarines and icebreakers, not simply reactor shells but full cores, were cast into the shallow waters of the Kara Sea. Seven of them were heavy with spent radioactive fuel, which contains isotopes that remain radioactive for millions of years.

The total count for Moscow’s dumping was about 2.5 million curies of radiation, twice what the twelve Western nations admitted to dumping. This highly disturbing information finally provided the impetus for the Clinton Administration to lead the way to a permanent worldwide ban of Ocean dumping of radioactive waste, which was signed on November 12, 1993 by 37 nations.

So, what can we do? Unfortunately, the answer is not very much. Attempts to move such waste will more than likely spread it, rather than gather it. The majority of the waste, maybe as much as 80%, is no longer radioactive. Maybe with new technology in the future, we can find ways to deal with the remaining 20% of the deadly material.

In the meantime, the best we can do is map the entire Ocean in high resolution, which would allow more accurate GIS models to be developed to locate such wastes and predict their spread from the dump sites. Additionally, sharing highly accurate maps with fishermen and other mariners will help protect humans from disturbing such sites or interacting with them unknowingly. Now if we could just post a memo for the sea creatures to read…

Ocean Obama

Last year, 2009, was a big one for the sole surviving world superpower and Mother Ocean. After eight years of relative silence, the President of our USA declared June 2009 to be National Ocean Month. Now we have Mother's Day and Ocean Month to help us remember to honor our Mothers; being in the Ten Commandments wasn't clear enough for us. Will next month, June 2010, be Ocean Month again? Regardless, that was not the only step the President took toward Mother Ocean.

On September 10th, 2009, the White House Council on Environmental Quality released its "Interim Report of The Interagency Ocean Policy Task Force," proposing the National Policy for our Stewardship of the Ocean. Below is an abstract of the 38-page report:

I. Vision
An America whose stewardship ensures that the ocean, our coasts, and the Great Lakes are healthy and resilient, safe and productive, and understood and treasured so as to promote the well-being, prosperity, and security of present and future generations.

II. National Policy Context
The Value of the Ocean, Our Coasts, and the Great Lakes
America is intricately connected to and directly reliant on the ocean, our coasts, and the Great Lakes. Each of us, whether living and working in the country's heartland or along its coasts, affects and is affected by these places. Their beauty inspires us, and their bounty contributes to our national well-being and security. Nearly half of our population is located in coastal counties. Our rich and productive coastal regions and waters account for the great majority of the national economy, totaling trillions of dollars each year, and support distant communities that may not even be aware of the connection between the land and sea. Millions of visitors enjoy our Nation's seashores each year, contributing not only to the economy, but also to personal and communal satisfaction and fulfillment. The sea is both a refuge for spiritual reflection and a powerhouse of excitement for educating students of all ages and interests.

With over 95,000 miles of coastline and the largest exclusive economic zone in the world, our Nation benefits from a wealth of goods and services derived from the ocean, our coasts, and the Great Lakes. They provide food, fresh water, minerals, energy, and other natural resources and ecological benefits. They support tens of millions of jobs, and are a source of recreation. They also play a critical role in our Nation's transportation, economy, and trade, as well as in the global mobility and readiness of our Armed Forces and the maintenance of international peace and security.

The ocean supports human health and well-being in myriad ways, including as a source of healthy foods, pharmaceuticals, and other beneficial compounds. The ocean is a source of existing energy and offers numerous opportunities for renewable energy, which can help to secure our energy independence and mitigate climate change.

The ocean and Great Lakes exert significant influence over how our planet functions. Covering over 70 percent of the Earth, the ocean plays a primary role in our planet’s environment and natural operations, including weather and climate. The ocean’s ability to absorb and store heat from the atmosphere and transport it to other parts of the globe keeps daily temperatures within a livable range. The Great Lakes are the largest freshwater system on Earth, with 10,000 miles of shoreline and some 95 percent of the Nation’s fresh surface water. While we commonly refer to different oceans (Atlantic, Pacific, Arctic, etc.), it is important to recognize that all of these bodies of water are connected and influenced by each other. These linkages require our Nation to recognize that we benefit from and affect one global ocean.

The ocean shapes and sustains all life on Earth. We are dependent on the ocean for the air we breathe, the food we eat, and the water we drink. Though we may not think about it, processes on land and in the water, including biological processes, are intricately linked so that changes in one can have profound effects on the other. The ocean is both the beginning and the end of the Earth’s water cycle. Water that evaporates from the surface of the ocean becomes rain that falls on our fields and fills our aquifers. Much of this precipitation eventually finds rivers which flow back to the sea, starting the cycle once more.

Half of the oxygen we breathe comes from microscopic plants living in the ocean. Coastal barrier islands, coral reefs, mangroves, and wetlands serve as buffers between coastal communities and damaging floods and storms. Coastal wetlands are a nursery for many recreational and commercial fish species, provide essential habitat for many migratory birds and mammals, and serve as a natural filter helping to keep our waters clean. Ocean and coastal ecosystems absorb and detoxify many pollutants, recycle nutrients, and help control pests and pathogens. Marine ecosystems house biological diversity exceeding that found in the world’s rain forests.

Challenges Facing the Ocean, Our Coasts, and the Great Lakes
The importance of ocean, coastal, and Great Lakes ecosystems cannot be overstated; simply put, we need them to survive. It is clear that these invaluable and life-sustaining assets are vulnerable to human activities and, at the same time, human communities are rendered more vulnerable when these resources are degraded. Yet, ocean, coastal, and Great Lakes ecosystems are experiencing an unprecedented rate of change due to human activities. We are only now beginning to understand the full extent of the direct and indirect consequences of our actions on these systems.

Climate change is impacting the ocean, our coasts, and the Great Lakes. Increasing water temperatures are altering habitats, migratory patterns, and ecosystem structure and function. Coastal communities are facing sea-level rise, inundation, increased threats from storms, erosion, and significant loss of coastal wetlands. The ocean's ability to absorb carbon dioxide from the atmosphere buffers the impacts of climate change, but also causes the ocean to become more acidic, threatening not only the survival of individual species of marine life, but also entire marine ecosystems. The ocean buffers increased global temperatures by absorbing heat, but increasing temperatures are causing sea levels to rise by expanding seawater volume and melting land-based ice. Increased temperatures may eventually reduce the ocean's ability to absorb carbon dioxide. Conversely, climate change is predicted to lower the water levels of the Great Lakes, thereby altering water cycles, habitats, and economic uses of the lakes.

Along many areas of our coasts and within the Great Lakes, biological diversity is in decline due to overfishing, introduction of invasive species, and loss and degradation of essential habitats from coastal development and associated human activities. The introduction of non-native species can carry significant ecological and economic costs. Human and marine ecosystem health are threatened by a range of challenges, including increased levels of exposure to toxins from harmful algal blooms and other sources, and greater contact with infectious agents. Areas in numerous bays, estuaries, gulfs, and the Great Lakes are now consistently low in or lacking oxygen, creating dead zones along our bays and coasts. Unsustainable fishing (e.g., overfishing) remains a serious concern with consequences for marine ecosystems and human communities. In the Arctic, environmental changes are revealing the vulnerability of its ecosystems. These changes are increasing stressors and impacts on the ecosystems, people, and communities in the region, and are presenting new domestic and international management challenges.

Many of these concerns are attributable not only to activities within marine and Great Lakes ecosystems, but also to actions that take place in our Nation’s interior. For example, our industries, agricultural and transportation operations, cities, and suburbs generate various forms of pollution. Industrial operations emit pollutants, such as nitrogen and mercury, into the atmosphere that often find their way into the ocean and Great Lakes. Rain washes residues, chemicals, and oily runoff from our roadways into our estuaries and coastal waters. Heavy rainfall events can wash sediment, pesticides, and nutrients from our fields, lawns, and agricultural operations into our waters. Urban and suburban development, including the construction of roads, highways, and other infrastructure, as well as modification to rivers and streams, can adversely affect the habitats of aquatic and terrestrial species.

Demands on the ocean, our coasts, and the Great Lakes are intensifying, spurred by population growth, migration to coastal areas, and economic activities. Energy development, shipping, aquaculture, and emerging security requirements are examples of new or expanding uses expected to place increasing demands on our ocean, coastal, and Great Lakes ecosystems. As these demands increase, we must also preserve the abundant and sustainable marine resources and healthy ecosystems that are critical to the well-being and continued prosperity of our Nation.

III. Policy
America's stewardship of the ocean, our coasts, and the Great Lakes is intrinsically and intimately linked to environmental sustainability, human health and well-being, national prosperity, adaptation to climate and other environmental changes, social justice, international diplomacy, and national and homeland security. Therefore, it is the policy of the United States to:

1. Healthy and Resilient Ocean, Coasts, and Great Lakes
Use the best available science and knowledge to inform decisions affecting the ocean, our coasts, and the Great Lakes, and enhance humanity's capacity to understand, respond, and adapt to a changing global environment.

2. Safe and Productive Ocean, Coasts, and Great Lakes
Support sustainable, safe, secure, and productive uses of the ocean, our coasts, and the Great Lakes;
Respect and preserve our Nation's maritime heritage, including our social, cultural, and historical values; and
Exercise rights and jurisdiction and perform duties in accordance with applicable international law, including respect for and preservation of navigational rights and freedoms, which are essential for the global economy and international peace and security.

3. Understood and Treasured Ocean, Coasts, and Great Lakes
Increase scientific understanding of ocean, coastal, and Great Lakes ecosystems as part of the global interconnected systems of air, land, ice, and water, including their relationships to humans and their activities;
Improve our understanding and awareness of changing environmental conditions, trends, and their causes, and of human activities taking place in ocean, coastal, and Great Lakes waters; and
Foster a public understanding of the value of the ocean, our coasts, and the Great Lakes to build a foundation for improved stewardship.

The United States will promote the objectives of this policy by:
Ensuring a comprehensive and collaborative framework for the stewardship of the ocean, our coasts, and the Great Lakes that facilitates cohesive actions across the Federal Government, as well as participation of State, tribal, and local authorities, regional governance structures, non-governmental organizations, the public, and the private sector; and
Cooperating and exercising leadership at the international level, including by joining the Law of the Sea Convention;

IMPLEMENTATION STRATEGY

PROPOSED NATIONAL PRIORITY OBJECTIVES

How We Do Business
1. Ecosystem-Based Management: Adopt ecosystem-based management as a foundational principle for the comprehensive management of the ocean, our coasts, and the Great Lakes.
2. Coastal and Marine Spatial Planning: Implement comprehensive, integrated, ecosystem-based coastal and marine spatial planning and management in the United States.
3. Inform Decisions and Improve Understanding: Increase knowledge to continually inform and improve management and policy decisions and the capacity to respond to change and challenges. Better educate the public through formal and informal programs about the ocean, our coasts, and the Great Lakes.
4. Coordinate and Support: Better coordinate and support Federal, State, tribal, local, and regional management of the ocean, our coasts, and the Great Lakes. Improve coordination and integration across the Federal Government and, as appropriate, engage with the international community.

Areas of Special Emphasis
1. Resiliency and Adaptation to Climate Change and Ocean Acidification: Strengthen resiliency of coastal communities and marine and Great Lakes environments and their abilities to adapt to climate change impacts and ocean acidification.
2. Regional Ecosystem Protection and Restoration: Establish and implement an integrated ecosystem protection and restoration strategy that is science-based and aligns conservation and restoration goals at the Federal, State, tribal, local, and regional levels.
3. Water Quality and Sustainable Practices on Land: Enhance water quality in the ocean, along our coasts, and in the Great Lakes by promoting and implementing sustainable practices on land.
4. Changing Conditions in the Arctic: Address environmental stewardship needs in the Arctic Ocean and adjacent coastal areas in the face of climate-induced and other environmental changes.
5. Ocean, Coastal, and Great Lakes Observations and Infrastructure: Strengthen and integrate Federal and non-Federal ocean observing systems, sensors, and data collection platforms into a national system and integrate that system into international observation efforts.

3. Inform Decisions and Improve Understanding: …Success in building our knowledge and applying it to improve management also relies on an engaged and informed public. Many Americans do not realize the importance of the ocean, our coasts, and the Great Lakes to their daily lives, the benefits they provide, or the possibilities they present for further discovery. There is great opportunity to raise awareness and identify ways we can help protect our waters and their resources.

Inform and Improve
The Plan Should Address:
• Identification of priority issues in addressing emerging topics and change in ocean, coastal, and Great Lakes ecosystems and processes;
• Specific scientific requirements and research needs, including the need for reconciling inconsistent standards, physical infrastructure, research platforms, organizations, and data management, to identify critical gaps, ensure high quality data, and provide information necessary to inform management, including mechanisms to transition research results into information products and tools for management;
• The development of a more comprehensive awareness of environmental conditions and trends and human activities that take place in the ocean, coastal, and Great Lakes environments; and
• Requirements for routine integrated ecosystem assessments and forecasts, including impacts related to climate change, to address vulnerability, risks, and resiliency, and inform tradeoffs and priority-setting.

Educate
The Plan Should Address:
• Challenges, gaps, opportunities, and effective strategies for training and recruiting the current and next generation of disciplinary and interdisciplinary scientists, technicians, operators, managers, and policy makers, with a particular focus on the needs of disadvantaged or under-served communities; and
• Identification of successful formal and informal education and public outreach approaches, including their application toward a focused nation-wide campaign to build public awareness, engagement, understanding, and informed decision-making, with specific emphasis on the state of ecosystems.

5. Ocean, Coastal, and Great Lakes Observations and Infrastructure: Strengthen and integrate Federal and non-Federal ocean observing systems, sensors, and data collection platforms into a national system and integrate that system into international observation efforts.

Obstacles and Opportunities
Our ability to understand weather, climate, and ocean conditions, to forecast key environmental processes, and to strengthen ocean management decision-making at all levels is informed by a sound knowledge base. Efficient and effective coordination of the many available tools, continued development of new tools and infrastructure, and integration of them into a cohesive, unified, robust system is becoming increasingly difficult as an ever increasing number of data collection and processing systems come on line. New ground-breaking observation technologies give us the ability to observe and study global processes at all scales. These new tools, if fully integrated, will significantly advance our knowledge and understanding of the ocean, our coasts, and the Great Lakes. Furthermore, successful integration of new tools and data will improve our ability to engage in science-based decision-making and ecosystem-based management by ensuring that biological, ecological, and social data and processes are included in the calculus.

The Plan Should Address:
• A nationally integrated system of ocean, coastal, and Great Lakes observing systems, comprised of Federal and non-Federal components, and cooperation with international partners and organizations, as appropriate;
• Regional and national needs for ocean information, to gather specific data on key ocean, coastal, and Great Lakes variables that are required to support the areas of special emphasis and other national needs;
• The use of unmanned vehicles and remote sensing platforms and satellites to gather data on the health and productivity of the ocean, our coasts, and the Great Lakes;
• The capabilities and gaps of the National Oceanographic Fleet of ships and related facilities; and
• Data management, communication, access, and modeling systems for the timely integration and dissemination of data and information products.

As the previous four-page abstract clearly demonstrates, having a complete Ocean map would greatly help America comply with our overall Policy and reach our objectives. Indeed, a complete map of our 4 million square mile EEZ will be required for this proposed National Policy to have any chance at success. To attempt this immense effort without high resolution bathymetry to provide a foundation for the required "science-based decision-making" would surely result in failure, as the map provides the basis for measurement and planning.

Furthermore, it appears clear that the White House is in favor of joining and finally ratifying the UN Law of the Sea Treaty, which means that the US finally plans to make a fully legal and geographically maximized claim on its EEZ. As we can claim up to 350 miles from our coastline, where our continental shelf extends that far, we will require a high resolution map to support such claims. Furthermore, once the world superpower signs the treaty and prepares to formalize its EEZ claim, every other nation will want to follow suit and secure its own claims. This trend will create a massive increase in worldwide demand for accurate and complete bathymetry, especially within a few hundred miles of any coastline, which happens to cover about 60% of the Ocean.

On December 9th, 2009, the Interagency Ocean Policy Task Force released its “Interim Framework for Effective Coastal and Marine Spatial Planning”. No need for an abstract on this framework. It is blatantly obvious that attempting to do complex “Spatial Planning” without a comprehensive map of the area is next to impossible. Proper GIS models will require accurate and complete bathymetry for our EEZ. Without proper models, we cannot hope to achieve successful planning. An Ocean map is a basic information requirement for “Effective Coastal and Marine Spatial Planning”.