Data Processing
The end product of the UAV survey work has to be an accurate, high-resolution, multi-layer image or map from which the user, or a computer system, can infer sub-surface geological features by selectively superposing the available images and maps.
Software plays a crucial role in transforming the myriad raw data blocks into an accurate, coherent, large-scale image or map of the survey region.
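As a minimal sketch of what that selective superposition involves, the code below alpha-blends a geophysical layer over a visual base map. The function, the layer names and the 0.4 blending weight are assumptions made for illustration, not part of any particular survey toolchain.

```python
# Minimal sketch of selective layer superposition: blend co-registered
# raster layers over a visual base map. The 0.4 opacity and synthetic
# layers are illustrative assumptions.
import numpy as np

def superpose(base, layers, weights):
    """Alpha-blend co-registered layers (H x W x 3 float arrays in 0..1)."""
    out = base.astype(float)
    for layer, w in zip(layers, weights):
        out = (1.0 - w) * out + w * layer.astype(float)
    return np.clip(out, 0.0, 1.0)

# Example with synthetic data: a base map plus one magnetic-anomaly layer.
base = np.random.rand(512, 512, 3)
magnetic = np.random.rand(512, 512, 3)
composite = superpose(base, [magnetic], [0.4])
```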
The quantity of data can be enormous: for example, in excess of 43,000 GBytes per day from 10 UAVs performing stereo, multi-spectral imaging. Parallel-processing the survey data in a reasonable time therefore necessitates a High Performance Computing (HPC) system, typically comprising from 64 to 1,024 high-end, 64-bit PCs, all running Linux because of its good scalability in a multi-processor system and its reliability.
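To see how a daily figure of that order can arise, the back-of-envelope calculation below multiplies out one plausible set of sensor parameters; every value in it is an assumption chosen for illustration, not a specification from the text.

```python
# Illustrative back-of-envelope estimate of the daily survey data volume.
# Every sensor parameter below is an assumed, plausible value.
N_UAVS        = 10          # UAVs surveying simultaneously
CAMERAS       = 2           # stereo pair per UAV
BANDS         = 5           # multi-spectral bands per camera
PIXELS        = 4_000_000   # 4 Mpixel per frame
BYTES_PER_PX  = 2           # 16-bit radiometric depth
FRAME_RATE    = 1.25        # frames per second
FLIGHT_HOURS  = 12          # imaging hours per day

bytes_per_second = CAMERAS * BANDS * PIXELS * BYTES_PER_PX * FRAME_RATE
gbytes_per_day = bytes_per_second * FLIGHT_HOURS * 3600 * N_UAVS / 1e9
print(f"~{gbytes_per_day:,.0f} GBytes/day")   # ~43,200 GBytes/day
```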
Although Linux has proven benefits when it comes to the scalability of multiple processing nodes, Microsoft has introduced its Windows Compute Cluster Server 2003 for use in computationally intensive tasks.
[Image: Hewlett-Packard computer cluster, shown at the SEG 2006 Conference.]
In the aerial reconnaissance context, software could be used to subtract images and maps taken at different times, such as on different days, to indicate the appearance or disappearance of objects and personnel. In fact, with sufficiently smart software and regular imaging of the terrain, the software should be able to indicate the movements of objects and personnel, even when they are heavily camouflaged.
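A minimal sketch of this image-differencing idea, assuming two already co-registered, orthorectified greyscale images and a hypothetical grey-level change threshold, might look like this:

```python
# Minimal sketch of frame-differencing change detection between two
# co-registered aerial images taken on different days. The file names
# and the threshold of 30 grey levels are illustrative assumptions.
import numpy as np
from PIL import Image

THRESHOLD = 30  # grey-level change considered significant (assumed value)

day1 = np.asarray(Image.open("survey_day1.png").convert("L"), dtype=np.int16)
day2 = np.asarray(Image.open("survey_day2.png").convert("L"), dtype=np.int16)

diff = np.abs(day2 - day1)      # per-pixel absolute change
changed = diff > THRESHOLD      # boolean mask of changed pixels

print(f"{changed.mean() * 100:.2f}% of pixels changed")
Image.fromarray((changed * 255).astype(np.uint8)).save("change_mask.png")
```

In practice the hard part is the co-registration step assumed away here; raw frames from different sorties must first be aligned before a simple subtraction is meaningful.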
In an ongoing aerial surveillance situation, various portions of the large data sets will be updated in real time as the data are received over a fast Free Space Optics data link into a Network Centric infrastructure, and the updates will then automatically be accessible to users connected to the Internet.
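The kind of incremental update this implies is sketched below: incoming data blocks, each tagged with the map tile they cover, overwrite only that tile in the published data set. The (row, col) tile keying, the block format and the in-memory store are all assumptions made for illustration; a real system would use a tile server or database.

```python
# Minimal sketch of incremental, tile-wise map updates as survey data
# blocks arrive. The tile keying and in-memory dict are illustrative
# assumptions, standing in for a real tile server or database.
import numpy as np

TILE = 256                      # tile edge length in pixels (assumed)
published = {}                  # (row, col) -> latest tile raster

def on_block_received(row, col, pixels):
    """Replace one tile of the published map with freshly received data."""
    assert pixels.shape == (TILE, TILE)
    published[(row, col)] = pixels   # only this tile changes; the rest
                                     # of the map remains untouched

# Example: a newly downlinked block updates tile (3, 7).
on_block_received(3, 7, np.zeros((TILE, TILE), dtype=np.uint16))
```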
Indeed, once the UAV platform and sensor systems are stabilised, manufactured and used in increasing volumes, the principal development work will be in the areas of:
The final and most impressive software element is the data interpretation software, which can automatically analyse all the available data and determine the location and attributes of geologically interesting features, such as mineral, oil and gas deposits.
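As a toy illustration of the simplest thing such interpretation software might do, the sketch below flags grid cells in one co-registered geophysical layer that deviate strongly from the regional background. The z-score test, the 3-sigma threshold and the synthetic grid are assumptions for illustration, far short of real interpretation software.

```python
# Toy sketch of automated anomaly flagging on one co-registered layer:
# cells that stand out strongly from the regional background are marked
# as candidates for geological follow-up. The z-score test and 3-sigma
# threshold are illustrative assumptions.
import numpy as np

def flag_anomalies(layer, sigma_threshold=3.0):
    """Return a boolean mask of cells that stand out from the background."""
    z = (layer - layer.mean()) / layer.std()
    return np.abs(z) > sigma_threshold

# Example with a synthetic magnetic-anomaly grid containing one hot spot.
grid = np.random.normal(0.0, 1.0, (200, 200))
grid[120:125, 80:85] += 8.0          # injected anomaly
candidates = flag_anomalies(grid)
rows, cols = np.nonzero(candidates)
print(f"{len(rows)} candidate cells, e.g. at {rows[0], cols[0]}")
```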
AUSTIN, Texas — Nov. 18, 2008 /PRNewswire/ — Today at the Supercomputing 2008 conference, Microsoft Corp. debuted in the top 10 of the world's most powerful supercomputers: a system built with the Shanghai Supercomputer Center and Dawning Information Industry Co. Ltd. ranked No. 10 with a parallel computing speed of 180.6 teraflops at 77.5 percent efficiency. A truly incredible achievement, considering that 12 months ago in Reno, Nevada, Microsoft was at No. 116 on the Top500 list at Top500.org. This comes on the heels of Windows HPC Server 2008 being released to manufacturing in September.
- Reduces costs and complexity of high-performance computing
- A broad platform for software vendors and an expanded playing field for hardware manufacturers
- Deep investments in HPC and commitment to driving innovation
June 9, 2006
- from http://www.eweek.com/article2/0,1895,1974530,00.asp
Microsoft is finally ready to enter the high-performance computing market, a technology dominated by open-source Linux technology. The Redmond, Wash., software maker released Windows Compute Cluster Server 2003 to manufacturing on June 9, with general availability of the product scheduled for August. It will be sold via volume licensing and OEM licensing for an estimated price of $469 a node, but prices will vary depending on the license and volume, John Borozan, group product manager for the Windows Server Division, told eWEEK. Evaluation copies of Windows Compute Cluster Server 2003, a 64-bit operating system for industry-standard x64 processors, will be handed out to attendees of Microsoft's TechEd 2006 conference in Boston the week of June 12, he said.
This is Microsoft's first software offering designed specifically to run parallel, high-performance computing applications for customers, and it provides a platform that can be deployed, operated and integrated with existing infrastructure and tools. Customers can also leverage their existing development skills using Visual Studio 2005, Borozan said.
The upcoming availability of the Windows Compute Cluster Server marks a milestone for Microsoft, which is a latecomer to a market largely dominated by Linux software.
While Microsoft will release a single 64-bit-only version of the software, it will run on all the hardware platforms supported by Windows Server 2003 Service Pack 1, on which it is based. All the major OEMs, including IBM, Hewlett-Packard, Dell and NEC Solutions America, as well as the major interconnect vendors, have announced support for the product.
Customer demand for HPC is being driven by increased performance in processors per compute node, the low acquisition price per node and the overall price/performance of compute clusters. These trends are driving new customers to adopt HPC to replace or supplement live, physical experiments with computer-simulated modeling, tests and analysis, Borozan said.
Analyst firm IDC says it expects unit shipments for HPC to expand by more than 12 percent annually over the next five years, and that high-performance computing clusters will see substantial customer adoption in the lower-end capacity segments of the market.
Uses of the Windows Compute Cluster Server by early adopters span oil and gas reservoir simulation and seismic processing; life-sciences simulations of enzyme catalysis and protein folding; and vehicle design and safety improvements.