From Big Data to Big Knowledge: Knowledge Engineering with Big Data
Big Data processing concerns large-volume, growing data sets with multiple, heterogeneous, autonomous sources, and explores complex and evolving relationships among data objects. This talk starts with the HACE theorem (http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6547630), which characterizes the features of the Big Data revolution, and presents BigKE, a big data knowledge engineering framework that handles fragmented knowledge modeling and online learning from multiple information sources, nonlinear fusion of fragmented knowledge, and automated demand-driven knowledge navigation. We discuss challenging issues and our ongoing research efforts with BigKE.
Xindong Wu is a Professor in the School of Computing and Informatics at the University of Louisiana at Lafayette (USA), a Yangtze River Scholar in the School of Computer Science and Information Engineering at the Hefei University of Technology (China), and a Fellow of the IEEE and the AAAS. He holds a PhD in Artificial Intelligence from the University of Edinburgh, UK. His research interests include data mining, Big Data analytics, knowledge engineering, and Web systems. He is Steering Committee Chair of the IEEE International Conference on Data Mining (ICDM), Co-Editor-in-Chief of the ACM Transactions on Knowledge Discovery from Data, and Editor-in-Chief of the Springer book series Advanced Information and Knowledge Processing (AI&KP).
Hearables: Continuous 24/7 monitoring of the state of body and mind
Future health systems require the means to assess and track the neural and physiological function of a user over long periods of time, and in the community. Human body responses are manifested through multiple, interacting modalities: the mechanical, electrical, and chemical. Yet current physiological monitors (e.g. actigraphy, heart rate) largely lack cross-modal ability and are inconvenient and/or stigmatizing. We address these challenges through an inconspicuous earpiece, which benefits from the relatively stable position of the ear canal with respect to vital organs. Equipped with miniature multimodal sensors, it robustly measures brain, cardiac, and respiratory functions. Comprehensive experiments validate each modality within the proposed earpiece, while its potential in wearable health monitoring is illustrated through case studies spanning these three functions. We further demonstrate how combining data from multiple sensors within such an integrated wearable device improves both the accuracy of measurements and the ability to deal with artifacts in real-world scenarios. This framework opens up avenues for applying a range of machine learning paradigms, from lifelong learning to Big Data, in a real-world application of utmost importance: new-generation health systems.
Danilo P. Mandic is a Professor of Signal Processing at Imperial College London, UK, working in the areas of adaptive signal processing and bioengineering. He is a Fellow of the IEEE, a member of the Board of Governors of the International Neural Network Society (INNS), a member of the Big Data Chapter within INNS, and a member of the IEEE SPS Technical Committee on Signal Processing Theory and Methods. He has received five best paper awards in brain-computer interface research, runs the Smart Environments Lab at Imperial, and has more than 300 publications in journals and conferences. Prof Mandic has received the President's Award for Excellence in Postgraduate Supervision at Imperial. He is a pioneer of Ear-EEG, a radically new in-the-ear-canal EEG recording system, and has extended this work to in-ear monitoring of vital signs. This work has appeared in IEEE Spectrum and MIT Technology Review and has won several awards.