Seemingly small technological failings can have enormous consequences. When they are strung together in complex systems, the results can be catastrophic.
In 1996, Europe’s then-newest unmanned satellite-launching rocket, the Ariane 5, exploded just seconds after taking off on its maiden flight from French Guiana. Onboard was a $500 million set of four scientific satellites built to study how the Earth’s magnetic field interacts with the solar wind.
According to the New York Times Magazine, the rocket’s self-destruction was triggered when its guidance computer tried to convert a 64-bit floating-point number concerning the rocket’s lateral velocity into a 16-bit integer, resulting in an overflow error that shut the guidance system down. It then passed control to an identical backup computer, but the second computer had also failed the same way at the same time because it was running the same software.
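The failure mode itself is easy to illustrate. The following is a minimal Python sketch, not the Ada flight code, showing how a value that fits comfortably in a 64-bit float can silently wrap when forced into a 16-bit integer, and how a simple range check turns the same condition into an explicit, handleable error. The names and values are purely illustrative.

```python
import struct

INT16_MIN, INT16_MAX = -2**15, 2**15 - 1

def to_int16_unchecked(value: float) -> int:
    """Truncate a float and wrap it into 16 bits, as an unchecked narrowing conversion might."""
    return struct.unpack("<h", struct.pack("<H", int(value) & 0xFFFF))[0]

def to_int16_checked(value: float) -> int:
    """Range-check before converting, so an out-of-range value fails loudly instead of corrupting state."""
    truncated = int(value)
    if not INT16_MIN <= truncated <= INT16_MAX:
        raise OverflowError(f"{value} does not fit in a signed 16-bit integer")
    return truncated

# Hypothetical velocity-related value, far outside the 16-bit range.
horizontal_velocity = 40_000.0
print(to_int16_unchecked(horizontal_velocity))  # prints -25536: silently wrong
print(to_int16_checked(horizontal_velocity))    # raises OverflowError: wrong, but visible
```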
Three years later, NASA’s $125 million Mars Climate Orbiter burned up in the Martian atmosphere because of a mismatch between metric and English units in the thruster data passed between the ground software and the spacecraft’s navigation software. What was intended to be a day of celebration of the craft’s arrival in Mars orbit instead became a costly lesson in units of measurement.
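The underlying bug is just as easy to reproduce in miniature. The sketch below, using hypothetical function names rather than anything from the mission software, shows how a bare number crossing an interface carries no unit information, and how converting explicitly at the boundary removes the ambiguity.

```python
LBF_S_TO_N_S = 4.44822  # newton-seconds per pound-force second

def thruster_impulse_lbf_s() -> float:
    """Stand-in for ground software that reports impulse in pound-force seconds."""
    return 10.0

def update_trajectory(impulse_n_s: float) -> None:
    """Stand-in for navigation code that assumes its input is in newton-seconds."""
    print(f"applying {impulse_n_s:.2f} N·s of impulse")

# Buggy hand-off: the raw number crosses the interface with no unit attached,
# so the trajectory model is off by a factor of about 4.45.
update_trajectory(thruster_impulse_lbf_s())

# Safer hand-off: convert explicitly at the boundary (or use a units library
# so mismatched quantities cannot be passed at all).
update_trajectory(thruster_impulse_lbf_s() * LBF_S_TO_N_S)
```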
Disasters like these involve multiple failings: in design, in validation, and in the interactions of humans with one another and with the system. Deficiencies in these same categories also produce shortcomings of lesser magnitude but higher frequency, the kind that affect many of us every day. Interoperability validation is a particular concern as software-centric systems become more numerous and complex. When devices built on different technologies, or on the same basic technology implemented differently, are combined into a single system, they need to be seamlessly interoperable. When they are not, when they prove incompatible, negative consequences large and small usually follow. This creates tension for developers who are also striving for performance improvements and competitive advantages for their products, and as technologies continue to evolve, compatibility becomes a rolling challenge. Standards are key to striking the right balance and to promoting ecosystems that serve customers well.
There is no doubt that the widespread adoption of software-centric systems has already yielded a host of benefits. It is changing the speed at which enterprises innovate, grow, and support their customers. It raises productivity, reduces time to market, and fulfills customer demands by leveraging information collected digitally. Combined with advanced analytics and data visualization, that information provides the insights needed for optimizing customer experience with both current products and solutions, and those still under development. Applied together with advanced hardware technology, advanced software technology is fundamental for accomplishing the digital transformation that many organizations are currently working to achieve. And speaking of transformation, one need only look to recent videos of the Perseverance rover successfully landing on Mars to see how much has changed in the U.S. space program since the Mars Climate Orbiter experience.
The challenge is to find new approaches for testing those advanced technologies, approaches that can efficiently reveal potential problems with performance, user experience, security, and interoperability. All of this needs to be done in real-world conditions before the technologies become embedded into devices and deployed in the field.
Addressing this challenge requires a corresponding emphasis on software-centric design and test solutions. With so much data being produced by the systems under test, obtaining raw test results alone is not enough. Sophisticated analysis and insightful interpretation of those results are critical, particularly for research and development. Automation capabilities are required to increase productivity and ensure consistency of test steps across units and under different operating conditions. As performance limits are pushed to new heights, simulation is more important than ever to gain confidence in a design prior to prototype fabrication.
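As one small illustration of that kind of automation, the sketch below uses pytest’s parametrization to run the same scripted measurement across several units and operating temperatures; the device interface, the placeholder measurement model, and the pass/fail limit are all hypothetical.

```python
import pytest

def measure_throughput(unit_id: str, temperature_c: float) -> float:
    """Hypothetical device-under-test hook; real hardware or a simulator would sit behind this."""
    return 100.0 - 0.1 * abs(temperature_c - 25.0)  # placeholder measurement model

UNITS = ["unit-01", "unit-02", "unit-03"]
TEMPERATURES_C = [-20.0, 25.0, 70.0]

@pytest.mark.parametrize("temperature_c", TEMPERATURES_C)
@pytest.mark.parametrize("unit_id", UNITS)
def test_throughput_meets_spec(unit_id: str, temperature_c: float) -> None:
    # The same steps run for every unit and operating condition, so results are
    # directly comparable and nothing depends on a manually executed procedure.
    assert measure_throughput(unit_id, temperature_c) >= 90.0
```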
Information security continues to grow in importance for the design and test of today’s products and solutions. With new generations of malware, and the attackers who wield them, active around the clock, security considerations cannot be left until deployment time; they must be addressed early in the design, and increasingly in the hardware as well as the software. One need only consider end applications in the financial, utility, communications, national defense, and transportation sectors to appreciate the importance of keeping these systems secure, and the consequences of failing to do so.
The transformation of the automobile and its supporting infrastructure provides a good example of these challenges in action. The latest vehicles feature a staggering amount of new hardware and software technology, enabling everything from the powertrain to Advanced Driver-Assistance Systems (ADAS) to progressively higher levels of autonomous driving. New technology for vehicle-to-everything communications, or V2X, will enable vehicles to communicate with each other and with elements of the traffic system, including roadside infrastructure, bicyclists, and pedestrians. According to the U.S. Department of Transportation, a successful V2X rollout could eliminate or reduce the severity of up to 80 percent of auto accidents; it could also dramatically reduce travel times and slash fuel consumption. But it is complicated technology: accounting for traffic patterns, adjusting to road conditions, responding to risks outside normal sightlines, and recognizing other driving hazards is a complex undertaking.
To help ensure the success of V2X, and of complex new software-centric systems generally, the design and test industry is applying its own innovations: richer system modeling; high-frequency, protocol-rich communications; intelligent, automated software testing; and advanced manufacturing solutions. We are also partnering deeply with market-leading customers to co-innovate at their pace, while working to advance standards development so that new technology ecosystems develop quickly, safely, and cost-effectively for customers. It is exciting work, and as the saying goes, “failure is not an option.”