Jeffrey Voas
Computer Scientist, National Institute of Standards and Technology
Title: Network of 'Things'
Abstract: System primitives allow formalisms, reasoning, simulations, and reliability and security risk trade-offs to be formulated and argued. In this work, five core primitives belonging to most distributed systems are presented. These primitives apply well to systems with large amounts of data, scalability concerns, heterogeneity concerns, temporal concerns, and elements of unknown pedigree with possible nefarious intent. These primitives form the basic building blocks for a Network of ‘Things’ (NoT) [NIST SP 800-183], including the Internet of Things (IoT). This talk discusses the underlying and foundational science of IoT and thus gives the audience a general understanding of what IoT is all about. The talk will also touch on the university curriculum needed to produce “workplace-ready” IoT engineers.
Bio: Jeffrey Voas is an author and innovator. He is currently a computer scientist at the US National Institute of Standards and Technology (NIST) in Gaithersburg, MD. Before joining NIST, Voas was an entrepreneur and co-founded Cigital, now part of Synopsys (Nasdaq: SNPS). He served as IEEE Reliability Society President (2003-2005, 2009-2010, 2017) and as an IEEE Director (2011-2012). Voas co-authored two John Wiley books, Software Assessment: Reliability, Safety, and Testability and Software Fault Injection: Inoculating Software Against Errors. He received his undergraduate degree in computer engineering from Tulane University (1985), and his M.S. and Ph.D. in computer science from the College of William and Mary (1986 and 1990, respectively). Voas is a Fellow of the IEEE, a member of Eta Kappa Nu, a Fellow of the Institution of Engineering and Technology (IET), and a Fellow of the American Association for the Advancement of Science (AAAS).
Piero P. Bonissone
CEO, Piero P Bonissone Analytics, LLC
Retired Chief Scientist, Coolidge Fellow GE Global Research
Title: Analytics for Industrial Internet Applications
Abstract: The Industrial Internet is the third disruptive wave, after the Industrial and the Internet revolutions. It is transforming our industries, just like the Internet revolution transformed our commerce. In this new context, we face a combination of hyper-connected intelligent machines, interacting with other machines and people, and generating large amounts of data that need to be analyzed by descriptive, predictive, and prescriptive models. As a result, we see the resurgence of analytics as a key differentiator for creating new services, the emergence of cloud computing as an enabling technology for service delivery, and the growth of crowdsourcing as a new phenomenon in which people play critical roles in creating information and shaping decisions in a variety of problems. We explore the intersection of these three concepts from the perspective of a machine-learning researcher and show how that job and its roles have evolved over time.
In the past, analytic model creation was an artisanal process, as models were handcrafted by experienced, knowledgeable model-builders. More recently, the use of meta-heuristics, such as evolutionary algorithms, has provided us with limited levels of automation in model building and maintenance. In the near future, we expect data-driven analytic models to become a commodity. We envision having access to a large number of data-driven models, obtained by a combination of crowdsourcing, cloud-based evolutionary algorithms, outsourcing, in-house development, and legacy models. In this context, the critical issue will be model ensemble selection and fusion, rather than model generation.
First, we will review the application of data-driven analytic models to asset diagnostics and prognostics, for assets such as aircraft engines, medical imaging devices, and locomotives. We will cover a case study on predicting the remaining useful life of each unit in a fleet of locomotives using fuzzy models.
Then we will explore the evolution of analytic models with the advent of cloud computing, and propose the use of customized model ensembles on demand, inspired by Lazy Learning. This approach is agnostic with respect to the origin of the models, making it scalable and suitable for a variety of applications. We successfully tested this approach in a regression problem for a power plant management application, using two different sources of models: bootstrapped neural networks, and GP-created symbolic regression models evolved in the cloud. We will also present results on the fusion of models for FlyQuest, a GE-sponsored Kaggle competition in which we crowd-sourced the generation of models predicting estimated runway and gate arrival times (ERA, EGA) over a month of US flights.
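The on-demand ensemble idea above can be illustrated with a minimal sketch. This is not the speaker's implementation; the function name, the inverse-error weighting, and the toy models are all illustrative assumptions. In the lazy-learning spirit, for each query we score every model in the pool by its error on the query's nearest neighbours in a held-out reference set, keep the best few, and fuse their predictions with weights inversely proportional to that local error.

```python
import numpy as np

def fuse_on_demand(models, X_ref, y_ref, x_query, k_models=3, k_neighbors=5):
    """Select and fuse models based on local performance near x_query.

    models: list of callables mapping an (n, d) array to (n,) predictions.
    X_ref, y_ref: held-out reference data used to assess local accuracy.
    """
    # Find the reference points closest to the query (Euclidean distance).
    dists = np.linalg.norm(X_ref - x_query, axis=1)
    nn = np.argsort(dists)[:k_neighbors]

    # Local mean absolute error of each model on those neighbours.
    local_err = np.array([
        np.mean(np.abs(m(X_ref[nn]) - y_ref[nn])) for m in models
    ])

    # Keep the k_models best local performers (a lazily built ensemble).
    best = np.argsort(local_err)[:k_models]

    # Inverse-error weights, normalised to sum to 1.
    w = 1.0 / (local_err[best] + 1e-9)
    w /= w.sum()

    preds = np.array([models[i](x_query[None, :])[0] for i in best])
    return float(np.dot(w, preds))

# Toy pool of three "models" of varying quality approximating y = 2x;
# the origin of each model (crowd-sourced, evolved, legacy) is irrelevant.
models = [lambda X: 2.0 * X[:, 0],          # accurate everywhere
          lambda X: 2.0 * X[:, 0] + 0.5,    # biased
          lambda X: np.zeros(len(X))]       # poor
rng = np.random.default_rng(0)
X_ref = rng.uniform(0.0, 1.0, size=(50, 1))
y_ref = 2.0 * X_ref[:, 0]
print(fuse_on_demand(models, X_ref, y_ref, np.array([0.5])))  # close to 1.0
```

Because the fusion step only needs each model's predictions on a handful of points, the pool can mix crowdsourced, cloud-evolved, and legacy models without any common training interface, which is the model-agnostic property the abstract highlights.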
Finally, we will explore research trends, challenges and opportunities for Machine Learning techniques in this emerging context of big data and cloud computing.
Bio: Dr. Bonissone is an independent consultant specializing in the use of analytics for Industrial Internet applications. He provides consulting services in machine learning (ML) and analytic applications, ranging from project definition and risk abatement to project evaluation, transition from development to deployment, and model maintenance. Recently, he defined and shaped new projects for GE Oil & Gas, prior to its integration with Baker Hughes Inc. (BHI). During the previous two years, he was an Advanced Analytics Advisor for Schlumberger (SLB), where he played a key role in SLB's Digital Transformation, contributing to parts forecasting and market intelligence, PHM projects related to equipment reliability, and other efforts. He was also a consultant for DIGILE, a Finnish startup, and for Ford Motor Co.
A former Chief Scientist at GE Global Research, from which he retired in 2014 after 34 years of service, Dr. Bonissone has been a pioneer in the fields of analytics, machine learning, fuzzy logic, AI, and soft computing applications. During the eighties, he conceived and developed the Diesel Electric Locomotive Troubleshooting Aid (DELTA), one of the first fielded expert systems, which helped maintenance technicians troubleshoot diesel-electric locomotives. He was the PI in many DARPA programs, from the Strategic Computing Initiative to Pilot's Associate, the Submarine Operational Automation System, and the Planning Initiative (ARPI). During the nineties, he led many projects in fuzzy control, from the hierarchical fuzzy control of turbo-shaft engines to the use of fuzzy logic in dishwashers, locomotives, and resonant converters for power supplies. He designed and integrated case-based and fuzzy-neural systems to accurately estimate the value of single-family residential properties used as mortgage collateral. In the early 2000s, he designed a fuzzy rule-based classifier, trained by evolutionary algorithms, to automate the placement of insurance applications for long-term care and term life while minimizing the variance of its decisions. This classifier has been in production since 2003. More recently, he has focused on the development of data-driven analytic models for asset diagnostics and prognostics, including the prediction of remaining life for each locomotive in a fleet to support efficient asset selection. His current interests are multi-criteria decision-making systems to support PHM applications, ensemble learning to leverage the diversity of multiple models, and automation of the model lifecycle to create, deploy, and maintain analytic models, providing customized performance while adapting to avoid obsolescence.
He is a Fellow of the Institute of Electrical and Electronics Engineers (IEEE), the Association for the Advancement of Artificial Intelligence (AAAI), and the International Fuzzy Systems Association (IFSA), and a Coolidge Fellow at GE Global Research. He received the 2012 Fuzzy Systems Pioneer Award from the IEEE Computational Intelligence Society (CIS). From 2010 to 2015, he chaired the Scientific Committee of the European Centre for Soft Computing. In 2008 he received the II Cajastur International Prize for Soft Computing from the European Centre for Soft Computing. In 2005 he received the Meritorious Service Award from the IEEE CIS. He has received two Dushman Awards from GE Global Research. He served as Editor-in-Chief of the International Journal of Approximate Reasoning for 13 years. He serves on the editorial boards of five technical journals and is Editor at Large of the IEEE Computational Intelligence Magazine. He has co-edited six books and has 150+ publications in refereed journals, book chapters, and conference proceedings, with 7,900 citations, an h-index of 39, and an i10-index of 143 (per Google Scholar). He has received 70 patents issued by the US Patent Office (with 20+ pending). From 1982 until 2005 he was an Adjunct Professor at Rensselaer Polytechnic Institute in Troy, NY, where he supervised 5 Ph.D. theses and 34 Master's theses. He has co-chaired 12 scientific conferences focused on Multi-Criteria Decision-Making, Fuzzy Sets, Diagnostics, Prognostics, and Uncertainty Management in AI. He was a member of the IEEE Fellow Committee in 2007-09, 2012-14, and 2016-17. In 2002, while serving as President of the IEEE Neural Networks Society (now CIS), he was a member of the IEEE Technical Activities Board. He has been an Executive Committee member of the NNC/NNS/CIS society in 1993-2012 and 2016-18, and an IEEE CIS Distinguished Lecturer in 2004-14 and 2017-19.
General Manager, Toshiba Electronics Asia (Singapore)
Title: Towards Consumer Storage Area Networking for Personalized Big Data Analysis
Abstract: Edge computing, that is, performing data processing at the edge of the network, is a natural technological movement in the recent wave of big data analysis and AI: some data must be analysed without being transferred to, or shared with, public cloud services, for reasons such as data processing latency requirements and strict data-sharing policies governing the personal data being processed. An edge computer cluster integrated with a local storage system is one key component of edge computing, where an appropriate design of storage area networking can affect not only the data processing performance but also the security of the edge computing system. The talk will cover existing use cases, solutions, experiments, and issues in the area of consumer storage area networking.
Bio: He received his Ph.D. in Information and Computer Sciences from Osaka University in 1994. He has been active in standardizing Internet security and mobility protocols for 17 years. He served as Chair of IEEE 802.21a and IEEE 802.21d, and as Vice Chair of ZigBee Neighborhood Area Network. He is a main contributor to RFC 5191 (PANA - Protocol for Carrying Authentication for Network Access). He received the IEEE Region 1 Technology Innovation Award in 2008 for innovative and exemplary contributions to the field of Internet mobility and security related research and standards. His current interest is storage area networking in distributed computing environments. He has been a General Manager at Toshiba Electronics Asia (Singapore) since 2015.