
Passing the test of complex technologies

Seemingly small technological failings can have enormous consequences. When they are strung together in complex systems, the results can be catastrophic.  

In 1996, Europe’s then-newest unmanned satellite-launching rocket, the Ariane 5, exploded just seconds after taking off on its maiden flight from French Guiana. Onboard was a $500 million set of four scientific satellites built to study how the Earth’s magnetic field interacts with the solar wind.

According to the New York Times Magazine, the rocket’s self-destruction was triggered when its guidance computer tried to convert a 64-bit floating-point number representing the rocket’s lateral velocity into a 16-bit integer. The value was too large to fit, the conversion overflowed, and the guidance system shut down. Control then passed to an identical backup computer – which had already failed in exactly the same way at the same moment, because it was running the same software.
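The flight software was written in Ada, where the out-of-range conversion raised an unhandled exception. The C sketch below – with a hypothetical lateral-velocity value, not the actual flight data – illustrates the failure mode and the kind of range guard that catches it:

```c
#include <stdint.h>
#include <stdio.h>

/* Guarded narrowing conversion: returns 0 on success, -1 if the value
   will not fit in a signed 16-bit integer. */
static int to_int16_checked(double value, int16_t *out) {
    if (value < INT16_MIN || value > INT16_MAX)
        return -1;               /* would overflow; report it, don't crash */
    *out = (int16_t)value;
    return 0;
}

int main(void) {
    double lateral_velocity = 40000.0;  /* hypothetical out-of-range reading */
    int16_t v;
    if (to_int16_checked(lateral_velocity, &v) != 0)
        printf("conversion rejected: %.1f exceeds int16 range\n",
               lateral_velocity);
    return 0;
}
```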

Three years later, NASA’s $125 million Mars Climate Orbiter burned up in the Martian atmosphere because of a mismatch between metric and English units: software on the ground supplied thruster data in English units, while the software onboard the spacecraft expected metric. What was meant to be a day of celebration of the craft’s arrival in Mars orbit turned out quite differently because of that misunderstanding over units of measurement.
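The arithmetic behind the mismatch is simple, as this small sketch shows. The impulse value is hypothetical; the conversion factor between the two units is the standard one:

```c
#include <stdio.h>

#define NS_PER_LBFS 4.44822  /* newton-seconds per pound-force-second */

int main(void) {
    /* Hypothetical thruster impulse computed on the ground in lbf-s... */
    double impulse_lbfs = 100.0;

    /* ...consumed by flight software that expects N-s. Skipping the
       conversion misstates the impulse by a factor of ~4.45. */
    double impulse_ns = impulse_lbfs * NS_PER_LBFS;

    printf("unconverted: %.1f  converted: %.1f N-s (factor %.2f)\n",
           impulse_lbfs, impulse_ns, NS_PER_LBFS);
    return 0;
}
```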

Disasters like these involve multiple failings – of design, validation, and interactions of humans with one another and with the system. And deficiencies in these same categories lead to system shortcomings of lesser magnitude but higher frequency that affect many of us every day in one way or another.

Interoperability validation is a particular concern as software-centric systems become more numerous and complex. When devices using different technologies, or even the same basic technology implemented differently, are combined into a single system, they need to be seamlessly interoperable. When they are not – when they prove incompatible – negative consequences large and small usually follow. There is tension here for developers, who are striving for performance improvements and competitive advantages for their products. As technologies continue to evolve, compatibility issues create a rolling challenge. Standards are key to striking the right balance and promoting the development of ecosystems that serve customers well.

There is no doubt that the widespread adoption of software-centric systems has already yielded a host of benefits. It is changing the speed at which enterprises innovate, grow, and support their customers. It raises productivity, reduces time to market, and fulfills customer demands by leveraging digitally collected information. Combined with advanced analytics and data visualization, that information provides the insights needed to optimize the customer experience with current products and solutions as well as those still in development. Applied together with advanced hardware technology, advanced software technology is fundamental to the digital transformation that many organizations are working to achieve. And speaking of transformation, one need only watch recent videos of the Perseverance rover successfully landing on Mars to see how much has changed in the U.S. space program since the Mars Climate Orbiter experience.

The challenge is to find new approaches for testing these advanced technologies – approaches that can efficiently reveal potential problems with performance, user experience, security, and interoperability. All of this needs to happen under real-world conditions, before the technologies are embedded in devices and deployed in the field.

Addressing this challenge requires a corresponding emphasis on software-centric design and test solutions. With so much data being produced by the systems under test, obtaining raw test results alone is not enough. Sophisticated analysis and insightful interpretation of those results are critical, particularly for research and development. Automation capabilities are required to increase productivity and ensure consistency of test steps across units and under different operating conditions. As performance limits are pushed to new heights, simulation is more important than ever to gain confidence in a design prior to prototype fabrication. 
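As one illustration of that consistency argument, here is a minimal table-driven test harness in C. The device model, operating conditions, and pass criteria are all made up for the sketch; the point is that the same test steps replay unchanged across conditions:

```c
#include <stdio.h>

/* Hypothetical device model: output voltage as a function of input
   voltage and ambient temperature (names and formula are illustrative). */
static double device_output(double v_in, double temp_c) {
    return v_in * (1.0 - 0.001 * (temp_c - 25.0));  /* simple drift model */
}

struct test_case {
    double v_in, temp_c;   /* operating condition */
    double expected, tol;  /* pass criteria */
};

int main(void) {
    /* Identical test steps applied across a table of conditions. */
    static const struct test_case cases[] = {
        { 5.0,  25.0, 5.000, 0.01 },
        { 5.0,  85.0, 4.700, 0.01 },
        { 5.0, -40.0, 5.325, 0.01 },
    };
    int failures = 0;
    for (size_t i = 0; i < sizeof cases / sizeof cases[0]; i++) {
        double got = device_output(cases[i].v_in, cases[i].temp_c);
        double err = got - cases[i].expected;
        if (err < -cases[i].tol || err > cases[i].tol) {
            printf("case %zu FAILED: got %.3f, expected %.3f\n",
                   i, got, cases[i].expected);
            failures++;
        }
    }
    printf("%d failure(s)\n", failures);
    return failures ? 1 : 0;
}
```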

Information security continues to grow in importance for the design and test of today’s products and solutions. With new generations of malware – and the attackers who wield them – on the attack 24/7, security considerations cannot be left until deployment time. They must be addressed early in the design, and increasingly in hardware as well as software. One need only consider end applications in the financial, utility, communications, national defense, and transportation sectors to realize the importance of keeping these systems secure, and the potential consequences of failing to do so.

The transformation of the automobile and its associated infrastructure provides a good example of these challenges in action. The latest vehicles feature a staggering amount of new hardware and software technology, enabling everything from the powertrain to Advanced Driver-Assistance Systems (ADAS) to the progressing levels of autonomous driving. New technology for vehicle-to-everything communications, or V2X, will enable vehicles to communicate with each other and with elements of the traffic system, including roadside infrastructure, bicyclists, and pedestrians. According to the U.S. Department of Transportation, V2X – if it succeeds – could eliminate or reduce the severity of up to 80 percent of auto accidents. It could also dramatically reduce travel times and slash fuel consumption. But it is complicated technology: accounting for traffic patterns, adjusting to road conditions, responding to risks outside normal sightlines, and recognizing other driving hazards add up to a complex undertaking.

To help ensure the success of V2X – and of complex new software-centric systems generally – the design and test industry is applying innovations of its own: richer system modeling, high-frequency and high-protocol-content communications, intelligent automated software tests, and advanced manufacturing solutions. We are also partnering deeply with market-leading customers to co-innovate at their pace, while working to advance standards development so that new technology ecosystems develop quickly, safely, and cost-effectively for customers. It is exciting work and, as the saying goes, “failure is not an option.”

