Friday, April 28, 2017

Fostering Innovation

Two ways to foster innovation are through the use of think tanks and through government investment.

Think tanks continue to evolve based upon the changing environment they function in and the challenges they attempt to address.  At the 2015 Global Think Tank Innovations Summit, several key recommendations were made (McGann, 2015).  The first stressed the importance of think tank networks and associations; the idea is to affiliate with think tanks tackling similar problems.  This is not surprising, as collaboration is becoming increasingly important.  Related to this point, the summit stressed that innovations and best practices should be shared.  Think tanks were encouraged to create newsletters as a means of sharing information, to use knowledge-sharing platforms, and to view other think tanks as potential partners.  On a practical front, the summit stressed the importance of transparency with regard to funding.  For researchers, disclosing possible conflicts of interest is critical.

Related to the funding of think tanks is the role the government should play in innovation.  Under neoclassical economic theory, the role of government policy is to correct market failures (Mazzucato, 2015). The thinking is that the government should step in only when there is a failure in the market, such as a monopoly.  The problem with this view is that markets are often short-sighted. An example of a long-term government investment that has recently paid off is the extraction of natural gas from shale formations.  In 1976, the Eastern Gas Shales Project was launched by the Morgantown Energy Research Center and the Bureau of Mines.  Also in 1976, the federal government formed the Gas Research Institute, which was funded by a tax on natural gas production, resulting in billions of dollars being spent researching the extraction process.

Closer to home, or to the ear, is the technology used in smartphones.  The Internet a smartphone connects to exists thanks to the Defense Advanced Research Projects Agency’s ARPANET. The GPS it uses to determine location came from the United States military’s Navstar project.  The fingerprint reader is based on technology created by a company founded by a professor who received a grant from the National Science Foundation.

The key takeaway is not that the government should throw money away on pointless research, but rather that it should invest in strategic, world-changing research that only it is in a position to bring to fruition.  It is important to note that the reward is not immediate.  Investments often take decades before a return is seen or before the impact is fully recognized.  The combination of strategic funding of think tanks with direct government facilitation of research could result in significant breakthroughs. Failure to make the investment may not be noticed immediately but will be felt by subsequent generations.

References
Mazzucato, M. (2015). The innovative state. Foreign Affairs, 94(1), 7-8.

McGann, J. G. (2015). 2015 Global Think Tank Innovations Summit Report: The Think Tank of the Future is Here Today.


Friday, April 21, 2017

Big Data in Genomics

Big Data holds tremendous promise in the field of genomics.  Genomics is inherently a Big Data problem, given the size and variety of genetic information.  Applying Big Data techniques enables the detection of predispositions toward disease and may help in the fight against cancer.

One goal of genomic medicine is to understand how an individual’s DNA affects their risk of different diseases (Leung, Delong, Alipanahi, & Frey, 2016). The idea is to treat the variations as variables and model the problem using machine learning techniques.  An example of a cell variable is the location where a protein binds to a strand of DNA containing a particular gene.
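As a toy sketch of the variants-as-variables idea, the example below treats each genetic variant as a feature column (0, 1, or 2 copies of an alternate allele) and disease status as the label.  The data, the choice of a random forest, and the two “driver” variants are all invented for illustration; real genomic models are far more involved (Leung et al., 2016).

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Rows are individuals; columns are variants (allele counts 0/1/2).
rng = np.random.default_rng(0)
X = rng.integers(0, 3, size=(200, 50))     # 200 individuals, 50 variants
y = (X[:, 3] + X[:, 17] > 2).astype(int)   # pretend two variants drive risk

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(model.predict_proba(X[:1]))          # estimated disease risk for one individual
```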

One problem with traditional genomics approaches is that they often rely on brittle, complex, and dated technologies.  A Big Data Analytics approach can reduce this risk.  The goal should be a scalable and portable system designed with reproducibility in mind.

The application of this technology is on the horizon.  Foundation Medicine was created to enable the type of testing that Steve Jobs underwent during his battle with cancer (Regalado, 2013).  The company sells a test that sequences the DNA of a person with cancer in the hope that the factors driving the cancer can be identified. Gathering DNA sequences from individuals known to have cancer enables research that was previously not possible.  Collaboration among companies such as H3 Biomedicine, which access a common data set, can lead to connections between genetic aberrations and disease (Adams, 2017).

The application of Big Data Analytics to genomics is an exciting area.  As with many Big Data projects, it has the potential to yield breakthrough results.  It is also affected by the same data quality, architectural, and compliance challenges that face other Big Data projects.

References

Adams, B. (2017). H3 Biomedicine boosts its cancer collab with Foundation Medicine. FierceBiotech.  Retrieved from http://www.fiercebiotech.com/biotech/h3-biomedicine-boosts-its-cancer-collab-foundation-medicine

Leung, M. K., Delong, A., Alipanahi, B., & Frey, B. J. (2016). Machine learning in genomic medicine: a review of computational problems and data sets. Proceedings of the IEEE, 104(1), 176-197.

Regalado, A. (2013). Steve Jobs Legacy and the Foundation Medicine IPO. MIT Technology Review.


Group Decision Making


Making decisions in a group is always challenging. If left unchecked, decisions are made on emotional or personality-based foundations, not logic and reason.

The Delphi Method of reaching decisions in a group is an iterative process managed by a facilitator (Mind Tools, 2017).  The idea is that a panel of experts is presented with a problem.  The experts do not meet and may not know who the other experts are.  They respond to the question or problem in writing.  The facilitator then compiles the responses and circulates the result to the group.  The panel then evaluates the group's responses and responds again, having considered differing points of view.  A simple sketch of this loop appears below.
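This is a minimal sketch of the facilitator's loop, assuming the experts give numeric estimates; the expert objects and their respond() method are hypothetical stand-ins for the written responses described above.

```python
import statistics

def delphi(experts, question, max_rounds=4, tolerance=0.1):
    feedback = None
    for _ in range(max_rounds):
        # Experts answer independently; they never meet or see names.
        estimates = [e.respond(question, feedback) for e in experts]
        # The facilitator compiles anonymous feedback and recirculates it.
        feedback = {"median": statistics.median(estimates),
                    "low": min(estimates), "high": max(estimates)}
        if feedback["high"] - feedback["low"] <= tolerance:
            break  # the panel has converged; stop early
    return feedback["median"]
```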

If one takes the position that there are essentially four decision-making approaches (Meier, n.d.), the Delphi Method is a consensus approach.  The consensus approach implies that the group discusses the decision until everyone agrees.  It should be used when the results of the decision are of high importance or buy-in by all parties is critical.  Depending on the personalities involved, it is possible for a consensus never to be reached.  In the consulting style of decision making, a single decision maker gathers various points of view, weighs them, and makes a decision.  The idea is that the decision maker is guided by experts but has both the authority and the accountability to make the decision independently.  A commanding-style decision maker is similar, except that they do not seek advice.  Lastly, the voting style of decision making can be viewed as a relaxed version of the consensus approach: rather than relying on a unanimous decision, the majority rules.  The challenge with a voting approach is that many of the parties may not feel obligated to act on the outcome.  It does not foster buy-in and may leave many team members feeling that their voice was not heard.  Also, if the voting is done in an open fashion, reasons other than a desire for the best outcome may cloud participants' votes.

As with many things, there are tradeoffs among the various approaches.  The factors include the importance of team buy-in, how rapidly the decision must be made, where accountability falls, and the impact on team morale.
References
Meier, J. D. (n.d.). 4 Decision Making Methods.

Mind Tools. (2017). The Delphi Method: Achieving Consensus Among Experts.   Retrieved from http://www.mindtools.com/pages/article/newTMC_95.htm


Using Big Data Analytics to Measure Learning


A trend in higher education is an increased focus on measuring learning (Adams Becker et al., 2017).  The basic idea is to apply Big Data Analytics techniques that have been used in other industries to education.  The goal is to enrich the education experience while identifying students in need of intervention.

The term learning analytics (LA) describes the capture, analysis, and use of data in an education-related environment.  LA takes a broad view of the education process, including the environment where it takes place. By capturing large and rich datasets, institutions and learners can generate customized feedback to improve progress.  Just as marketing organizations attempt to gather a holistic view of their target audience, educational institutions are using similar approaches.  For example, social media content is combined with data captured using video cameras and other tracking technologies. Figure 1 presents a simple visual representation of the approach.  Information about which educational resources are being utilized (such as the library) is captured and used with the other data to gain insight into students’ behaviors.

Predictive analysis is used to identify at-risk students, prompting intervention.  For example, if a pattern associated with dropping out is identified, the educational institution can intervene and attempt to modify the student's trajectory. Likewise, analytics are performed on overall student activities to gain insight into course and program effectiveness.  This information can be used by curriculum designers to make improvements.
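As a toy sketch of the idea, activity features per student (logins per week, assignments submitted, forum posts) are fit against a dropout label from past cohorts, and a new student is flagged when the predicted risk crosses a threshold.  All of the numbers, feature names, and the threshold are fabricated for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: logins/week, assignments submitted, forum posts. Label: 1 = dropped out.
X_train = np.array([[12, 8, 5], [1, 2, 0], [9, 7, 3], [0, 1, 1], [15, 9, 6]])
y_train = np.array([0, 1, 0, 1, 0])

model = LogisticRegression().fit(X_train, y_train)
risk = model.predict_proba([[2, 1, 0]])[0, 1]   # a new, mostly inactive student
if risk > 0.5:
    print(f"flag for advisor outreach (risk={risk:.2f})")
```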

It is not hard to envision the equivalent of market basket analysis (association rules) being used to make suggestions to students about courses or even degree programs.  For example, a student might receive an email saying, “Based upon your previous academic activities, we recommend you take the following courses and consider the following majors.”
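A minimal sketch of the market-basket idea applied to transcripts: count how often course pairs co-occur and surface pairs whose support and confidence clear a threshold.  The course names and thresholds are invented for the example.

```python
from collections import Counter
from itertools import combinations

transcripts = [
    {"STATS101", "CS200", "CS300"},
    {"STATS101", "CS200"},
    {"CS200", "CS300"},
    {"STATS101", "CS200", "MATH210"},
]

pair_counts = Counter()
for courses in transcripts:
    pair_counts.update(combinations(sorted(courses), 2))

for (a, b), count in pair_counts.items():
    support = count / len(transcripts)                     # P(a and b)
    confidence = count / sum(a in t for t in transcripts)  # P(b | a)
    if support >= 0.5 and confidence >= 0.6:
        print(f"students who took {a} often also took {b} "
              f"(support={support:.2f}, confidence={confidence:.2f})")
```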


Figure 1: Modern Education Approach

This approach to education has several issues to overcome.  There are serious legal and ethical questions regarding this degree of observation of students.  The earlier example of using predictive analytics to identify students at risk of dropping out could also be used by an institution to artificially improve its retention rate. Likewise, without proper disclosure and consent, tracking a student’s movements and utilization of course materials is likely an invasion of privacy.

The best approach to addressing these issues is with transparency, disclosure, and required consent.  Just as a user of a social network or search engine should be able to see and download all data being stored about them, a student should have the same rights.  Controls should be in place to ensure that the collected data is accessed in an appropriate way.  For example, a student should have to grant permission before even their parents can view their activity.

As with most Big Data applications, LA has both promise and challenges.  Moving to a data-driven educational system has the potential to enable better and previously unimaginable learning.  It also has the potential to take “teaching to the test” to a new level.



References

Adams Becker, S., Cummins, M., Davis, A., Freeman, A., Hall Giesinger, C., & Ananthanarayanan, V. (2017). NMC Horizon Report: 2017 Higher Education Edition.   Retrieved from http://www.nmc.org/publication/nmc-horizon-report-2017-higher-education-edition/


Tuesday, April 18, 2017

The importance of research

Research is very important in all aspects of our lives.  We take for granted many things that resulted from research.  Some are lifesaving discoveries; others are products we use daily without fully understanding.
In 1966, Leonard Baum and Ted Petrie published a paper on Markov chains (Baum & Petrie, 1966).   The model this work describes is now generally referred to as a hidden Markov model.  The paper built upon the research of others and laid the groundwork for future advancements.  In 1975, Dr. James Baker used Baum and Petrie’s work to create a speech processing system called DRAGON (Rabiner, 1989).  In 1982, he and his wife founded Dragon Systems.  In 1997, Dragon Systems released Dragon NaturallySpeaking (Dragon).  ScanSoft acquired Dragon Systems and in 2005 merged with Nuance.
Google introduced experimental searching by voice in the early 2000s (Franz & Milch, 2002).   In 2006, Google released GOOG-411, a speech recognition service (Schuster, 2010).  This service was a computerized version of directory assistance.  In 2008, Google released Voice Search in the United States.  It was similar to GOOG-411, except that it used the phone’s data network to transfer the audio instead of the phone’s audio line.
Apple released the iPhone 4S on October 14, 2011 (Murph, 2011).  The iPhone 4S included a new feature named Siri (Apple, 2011), marketed as a digital personal assistant.  Apple used Nuance’s technology to power Siri (Beasley, 2014).
This example demonstrates the power of research.  Dr. Baker built upon the work of Doctors Baum and Petrie to create the foundation for technology that changed the way we interact with cell phones.  Hidden Markov models have been used for a multitude of classification problems such as DNA sequencing (Nelson, Foo, & Weatherspoon, 2008), image processing (Kalavathy & Suresh, 2010),  stock market prediction (Somani, Talele, & Sawant, 2014), and fraud detection (Iyer, Mohanpurkar, Janardhan, Rathod, & Sardeshmukh, 2011).
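To make the idea concrete, below is a minimal sketch of the forward algorithm, the core computation behind HMM-based classification (Rabiner, 1989): score a sequence under competing models and pick the model with the highest likelihood.  The toy probabilities are arbitrary.

```python
import numpy as np

start = np.array([0.6, 0.4])          # initial state distribution
trans = np.array([[0.7, 0.3],         # hidden-state transition matrix
                  [0.4, 0.6]])
emit = np.array([[0.5, 0.4, 0.1],     # per-state emission probabilities
                 [0.1, 0.3, 0.6]])

def forward(obs):
    """Return P(observation sequence | model) via the forward algorithm."""
    alpha = start * emit[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ trans) * emit[:, o]  # propagate, then weight by emission
    return alpha.sum()

print(forward([0, 1, 2]))  # likelihood of observing symbols 0, 1, 2 in order
```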
This is one example of how research enriches our lives.  It is a gradual process; this example spans fifty years.  It reinforces the material covered in the text and elsewhere: there is a web of knowledge, and a researcher looks for something that is not there but should be.  Doctors Baum and Petrie examined the previous work on Markov chains and had insight into a new way to use them for classification.  Doctor Baker used the work on hidden Markov models to address the problem of computerized speech recognition.  The researchers at Google saw the problem of a user having to type search queries and developed a speech interface.  Apple made it mainstream.  So, in answer to the question of whether research is important: ask Siri.

References
Baum, L. E., & Petrie, T. (1966). Statistical inference for probabilistic functions of finite state Markov chains. The Annals of Mathematical Statistics, 37(6), 1554-1563.  Retrieved from http://www.jstor.org/stable/2238772
Beasley, M. (2014). Why is Apple hiring Nuance engineers? Apparently to replace Siri’s Nuance-powered backend.   Retrieved from http://9to5mac.com/2014/06/30/why-is-apple-hiring-nuance-engineers-apparently-to-replace-siris-nuance-powered-backend/
Dragon. History of speech and voice recognition and transcription software.  Retrieved from http://www.dragon-medical-transcription.com/history_speech_recognition_timeline.html
Franz, A., & Milch, B. (2002). Searching the Web by voice. Paper presented at the Proceedings of the 19th international conference on Computational linguistics - Volume 2, Taipei, Taiwan.
Iyer, D., Mohanpurkar, A., Janardhan, S., Rathod, D., & Sardeshmukh, A. (2011, 11-14 Dec. 2011). Credit card fraud detection using Hidden Markov Model. Paper presented at the Information and Communication Technologies (WICT), 2011 World Congress on.
Kalavathy, S., & Suresh, R. M. (2010, 15-17 Dec. 2010). Image denoising using improved semantic approximation algorithm in wavelet domain hidden Markov model. Paper presented at the Signal and Image Processing (ICSIP), 2010 International Conference on.
Murph, D. (2011). iPhone 4S hands-on!   Retrieved from http://www.engadget.com/2011/10/04/iphone-4s-hands-on/
Nelson, R., Foo, S., & Weatherspoon, M. (2008, 16-18 March 2008). Using hidden Markov modeling in DNA sequencing. Paper presented at the System Theory, 2008. SSST 2008. 40th Southeastern Symposium on.
Rabiner, L. (1989). A tutorial on hidden Markov models and selected applications in speech recognition. Proceedings of the IEEE, 77(2), 257-286. doi:10.1109/5.18626
Schuster, M. (2010). PRICAI 2010: Trends in artificial intelligence (B.-T. Zhang & M. A. Orgun, Eds.): Springer.
Somani, P., Talele, S., & Sawant, S. (2014, 20-21 Dec. 2014). Stock market prediction using hidden Markov model. Paper presented at the Information Technology and Artificial Intelligence Conference (ITAIC), 2014 IEEE 7th Joint International.

Complex Event Processing for the Internet of Things

Ubiquitous connected sensing devices permeate our world.  These devices capture large amounts of data.  One challenge associated with this data is converting it to actionable insight.  This paper outlines the history of the Internet of Things and explores the application of rule-based complex event processing.

Internet of Things

In 1989, Tim Berners-Lee proposed what would become the World Wide Web (Berners-Lee, 1989).  He proposed organizing data as a web to help CERN, his employer, keep track of documents.  At the time, keyword searching required the manual registration of the keywords associated with a document.  A year later, in 1990, John Romkey demonstrated control of a Sunbeam Deluxe Automatic Radiant Control toaster over the Internet at the Interop conference (Malamud, 1992). The toaster was controlled using the Simple Network Management Protocol (SNMP).
Mark Weiser’s article on ubiquitous computing was published by Scientific American in 1991 (Weiser, 1991).  The article predicted that computers would fade into the background as they became part of the natural human environment.  The predictions in the article are more striking when we recall that at the time of writing, Microsoft Windows 3.0 had only recently been released ("Microsoft Windows 3.0," n.d.).  The Trojan Room Coffee Pot images were streamed using the HTML image tag in 1993 (Gordon & Johnson, n.d.).   When the group connected the page to the Internet, it became the first webcam.
In 1997, Paul Saffo predicted that sensors would become the next key enabling technology (Saffo, 1997).  He argued that in the 1980s, inexpensive microprocessors drove innovation.  Inexpensive lasers followed microprocessors, benefiting communications and storage.  He foresaw that sensors would be the next enabling technology.  In 1999, Kevin Ashton coined the term “Internet of Things” during a presentation to Procter & Gamble (Ashton, 2009).  Ashton was attempting to explain radio-frequency identification (RFID) to a group of executives.
The Internet of Things (IoT) is an umbrella term used to describe the connection of physical or virtual devices to the Internet (Bassi et al., 2013; Hui, Jiafu, Caifeng, & Jianqi, 2012; Singh & Singh, 2015; Tiburski, Albernaz Amaral, De Matos, & Hessel, 2015).  IoT solutions usually contain three core layers: sensing, network, and application (Cavalcante, Alves, Batista, Delicato, & Pires, 2015).  The sensing, or device, layer is concerned with capturing information.  The network layer is concerned with enabling communication between the sensing layer and the application layer; communication through this layer is often bidirectional.  The application layer is concerned with aggregation, event processing, or external communication.  One form of event processing related to the IoT is Complex Event Processing.

Complex Event Processing Background

The term Complex Event Processing (CEP) was coined by David Luckham in 1995 (Leavitt, 2009).  Luckham argued that CEP allowed for the creation of an abstraction hierarchy (D. C. Luckham & Frasca, 1998).   An abstraction hierarchy organizes a system’s activities and operations into a set of layers.  A popular application of this concept is the Open Systems Interconnection model.  Luckham was part of a project that created RAPIDE. 
RAPIDE contained a causal event pattern language (D. Luckham, 2002).  The language allowed for the control of processing flow.  It also specified operations such as filtering and aggregation.  The output of one set of operations could be directed to another, allowing the creation of multiple layers of abstraction.  This was not the first time this sort of analysis was performed on streams of data, but previously it required specialized programs or hand-coded solutions (Leavitt, 2009).
CEP is a subset of Event Stream Processing (ESP) (Weber, Lowe, Malunjkar, & Quinn, 2010).  CEP extended the single event stream processing concept of ESP to include multiple events with a temporal relationship.  Prior to CEP and ESP, real-time information was inserted into a relational database management system.  An inference engine evaluated the data within the database and detected the conditions to trigger an action.  The key distinction is that CEP and ESP do not require the persistence of the real-time information. 
Luckham’s initial implementation of CEP using RAPIDE represented the processing to occur as a set of nodes and arcs.  Because of the popularity of Structured Query Language (SQL) and vendor experience with relational databases, many CEP vendors selected SQL as the means of representing processing rules (Leavitt, 2009).  They retained the basic syntax while extending the language with the required constructs.
In recent systems, the Complex Event Processing pipeline comprises three phases (Mehdiyev, Krumeich, Enke, Werth, & Loos, 2015).  Filtering removes items that are not relevant from the processing flow.  Next, matching is attempted using the input from the previous step and the predefined rules.  The last step uses the output from the matching step to attempt to derive events that are more complex.  A sketch of this flow appears below.
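This is a minimal in-memory sketch of the three phases, assuming events arrive as dictionaries with a device id, a type, a value, and a timestamp; the overheat rule and its thresholds are hypothetical.

```python
from collections import deque

WINDOW_SECONDS = 300   # temporal window for matching
THRESHOLD = 30.0       # hypothetical rule: sustained high temperature

window = deque()       # recent events held in memory only; nothing is persisted

def filter_event(event):
    """Phase 1: drop events that no rule cares about."""
    return event.get("type") == "temp"

def match(event):
    """Phase 2: match the predefined rule over the temporal window."""
    window.append(event)
    while window and event["ts"] - window[0]["ts"] > WINDOW_SECONDS:
        window.popleft()                       # expire old events
    readings = [e["value"] for e in window if e["device"] == event["device"]]
    return len(readings) >= 3 and min(readings) > THRESHOLD

def derive(event):
    """Phase 3: emit a higher-level (complex) event."""
    return {"type": "overheat", "device": event["device"], "ts": event["ts"]}

def process(event):
    if filter_event(event) and match(event):
        print("complex event:", derive(event))
```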
CEP is stateless, performs all processing in memory, and has been applied to machine-to-machine (M2M) communications (Bruns, Dunkel, Masbruch, & Stipkovic, 2015).  CEP is rule-based, which enables a high degree of flexibility when applied to M2M solutions.  M2M is a foundational technology for the IoT.

Application of CEP to IoT

Data produced by the Internet of Things is one of the fastest growing types of data (Barnaghi, Sheth, & Henson, 2013).  The devices that make up the IoT send data to the Internet in a stream or in reaction to an event.  Because the devices producing and transmitting this continuous spatiotemporal data are low cost and operate in dynamic environments, a continuous flow is not guaranteed.  The transmitted data may be aggregated and summarized, or it might be at a low level.  The data the devices produce varies considerably in structure and complexity.

Current Trends

CEP continues to be a relevant technology.  The market is expected to be worth $4.95 billion by 2020 (MarketAndMarkets, 2016).  Microsoft released its version of CEP, named Azure Stream Analytics, in 2014.  Azure Stream Analytics uses an SQL-like syntax to represent the CEP rules.  TIBCO acquired StreamBase on June 11, 2013.  StreamBase is a CEP system that uses a language named StreamSQL EventFlow (StreamBase, 2016).  Oracle offers the Stream Explorer platform (Oracle, 2016).  WSO2 offers Complex Event Processor that is available on-premise or as part of WSO2’s Amazon hosted private cloud.  It uses an SQL-like language to define the event processing (WSO2, 2016).  This is by no means an exhaustive list of offerings.  It appears that most vendors offer a CEP solution. 
The IoT is not a clearly defined concept.  The underlying concepts of ubiquitous computing, inexpensive sensors, highly connected devices, and Big Data Analytics continue to grow and expand.  It is projected that the IoT market will reach $1.7 trillion by 2020 (IDC, 2015).
CEP and IoT are complementary technologies.  IoT produces a stream of data and CEP processes a stream of data.

Research Opportunities

CEP requires rules developed by a subject matter expert, and these rules are typically represented in an SQL-like language.  There have been several research efforts to auto-discover the rules (Lin, Wu, Huang, & Tseng, 2015; Mehdiyev et al., 2015).  These efforts are useful, but it might also be useful to make the representation of the rules more accessible.  SQL is commonly known among developers and database administrators, but it is not a skill typically held by end users.  An effort to evaluate alternative ways to construct a CEP system’s rules would be useful.  Such a system would represent the CEP rule logic so that it could be authored, displayed, edited, and used to produce a targeted CEP system’s rule representation, as sketched below.
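Here is a sketch of what such an intermediate representation might look like: the rule is authored as plain data (which a UI could render and edit), and a small compiler emits a generic stream-SQL string.  The structured form and the output dialect are both hypothetical; real vendor syntaxes differ.

```python
rule = {
    "source": "SensorStream",
    "window_minutes": 5,
    "where": [("type", "=", "temp")],
    "aggregate": ("AVG", "value"),
    "having": (">", 30.0),
    "emit": "OverheatAlert",
}

def to_stream_sql(r):
    """Compile the structured rule into an illustrative stream-SQL string."""
    conditions = " AND ".join(f"{f} {op} '{v}'" for f, op, v in r["where"])
    func, field = r["aggregate"]
    op, threshold = r["having"]
    return (f"INSERT INTO {r['emit']} "
            f"SELECT {func}({field}) FROM {r['source']} "
            f"[RANGE {r['window_minutes']} MINUTE] "
            f"WHERE {conditions} HAVING {func}({field}) {op} {threshold}")

print(to_stream_sql(rule))
```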
Most IoT environments are heterogeneous in nature.  The same fields often have differing names, formats, or units of measure.  In order to perform CEP on the generated data, it is useful for the data to be standardized prior to processing.  A valuable research effort would be to consider a stateless preprocessing engine that accepted varying known formats and converted them to a standardized output for ingestion by CEP.
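A minimal sketch of such a stateless preprocessor, assuming a small set of known input formats; the field aliases and the unit conversion are invented for illustration.

```python
FIELD_ALIASES = {"temp": "temperature", "tempF": "temperature", "t": "temperature"}

def normalize(event):
    """Map known field names onto a standard schema and standardize units."""
    out = {}
    for key, value in event.items():
        if key == "tempF":                    # convert Fahrenheit to Celsius
            value = (value - 32) * 5.0 / 9.0
        out[FIELD_ALIASES.get(key, key)] = value
    return out

print(normalize({"tempF": 86, "device": "a1"}))  # {'temperature': 30.0, 'device': 'a1'}
```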
Fog computing is a computing approach where processing occurs in the device or sensing layer of the IoT (Díaz, Martín, & Rubio, 2016).  The motivation behind fog computing is that there are situations where cloud computing is not adequate.  Issues such as latency or location awareness require that processing occur as close as possible to the device.  CEP could be performed in the fog layer, reducing bandwidth utilization by sending information only when an event is detected.
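The bandwidth argument can be sketched directly: the rule runs on the edge device itself, and only detected events, not raw readings, are transmitted upstream.  The read_sensor and send_upstream callables are hypothetical stand-ins for device I/O.

```python
def edge_loop(read_sensor, send_upstream, threshold=30.0):
    """Run a simple CEP rule in the fog layer; transmit only on state change."""
    alerting = False
    while True:
        value = read_sensor()
        if value > threshold and not alerting:
            send_upstream({"event": "threshold_exceeded", "value": value})
        alerting = value > threshold
```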
IoT communications comprise two types of messages: sensor data flowing from the device to a processing destination, and data flowing to the device to cause it to interact with its environment.  For example, a device might report a room temperature.  That data flows through the network and is processed by CEP, which produces actions based on conditions.  If the temperature is over a threshold, a message is sent to the device indicating that the fan should be turned on to increase the airflow into the room.  An investigation into a framework that integrates CEP output with the IoT actuator flow would be a worthwhile research effort.
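A sketch of closing that loop: a derived CEP event is translated into a command sent back down to the device’s actuator.  The send_command transport is a hypothetical stand-in for the network layer.

```python
def on_complex_event(event, send_command):
    """React to a derived event by commanding the reporting device."""
    if event["type"] == "overheat":
        send_command(event["device"], {"actuator": "fan", "state": "on"})

# Example: on_complex_event({"type": "overheat", "device": "t1"}, print)
```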
The IoT is being extended beyond the physical world to include virtual constructs (Ciciriello, Mottola, & Picco, 2006).  This can be viewed as creating an abstraction hierarchy.  The combination of virtual sensors and actuators with an abstraction hierarchy of virtual sensors should be researched further.
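A small sketch of the virtual-sensor idea: a virtual device aggregates readings from several physical (or other virtual) sensors and presents itself through the same interface, which is what makes an abstraction hierarchy possible.  The read() interface is invented for illustration.

```python
class VirtualSensor:
    """Presents several underlying sensors as one; layers can be nested."""
    def __init__(self, sensors):
        self.sensors = sensors   # each exposes a read() method

    def read(self):
        readings = [s.read() for s in self.sensors]
        return sum(readings) / len(readings)   # e.g., average room temperature

# Because VirtualSensor itself exposes read(), virtual sensors can be
# composed into higher layers: VirtualSensor([VirtualSensor([...]), ...]).
```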

Conclusion

The IoT is a producer of streams of data, and CEP is an efficient and popular way of processing streams of data.  The history of CEP is rooted in pattern matching and relational database management systems.  The combination of CEP and IoT is an advantageous one. 

References

Ashton, K. (2009). That ‘internet of things’ thing. RFiD Journal, 22(7), 97-114.
Barnaghi, P., Sheth, A., & Henson, C. (2013). From data to actionable knowledge: Big Data challenges in the Web of Things. Intelligent Systems, IEEE, 28(6), 6-11. doi:10.1109/MIS.2013.142
Bassi, A., Bauer, M., Fiedler, M., Kramp, T., Kranenburg, R., Lange, S., & Meissner, S. (2013). Enabling things to talk: Designing IoT solutions with the IoT architectural reference model: Springer.
Berners-Lee, T. (1989). Information Management: A proposal.   Retrieved from https://www.w3.org/History/1989/proposal.html
Bruns, R., Dunkel, J., Masbruch, H., & Stipkovic, S. (2015). Intelligent M2M: Complex event processing for machine-to-machine communication. Expert Systems with Applications, 42(3), 1235-1246. doi:http://dx.doi.org/10.1016/j.eswa.2014.09.005
Cavalcante, E., Alves, M. P., Batista, T., Delicato, F. C., & Pires, P. F. (2015). An analysis of reference architectures for the Internet of Things. Paper presented at the Proceedings of the 1st International Workshop on Exploring Component-based Techniques for Constructing Reference Architectures, Montreal, QC, Canada.
Ciciriello, P., Mottola, L., & Picco, G. P. (2006). Building virtual sensors and actuators over logical neighborhoods. Paper presented at the Proceedings of the international workshop on Middleware for sensor networks.
Díaz, M., Martín, C., & Rubio, B. (2016). State-of-the-art, challenges, and open issues in the integration of Internet of Things and cloud computing. Journal of Network and Computer Applications. doi:http://dx.doi.org/10.1016/j.jnca.2016.01.010
Gordon, D., & Johnson, M. (n.d.). The story of the trojan room coffee pot: A timeline.  Retrieved from http://www.cl.cam.ac.uk/coffee/qsf/timeline.html
Hui, S., Jiafu, W., Caifeng, Z., & Jianqi, L. (2012, 23-25 March 2012). Security in the Internet of Things: A review. Paper presented at the Computer Science and Electronics Engineering (ICCSEE), 2012 International Conference on.
IDC. (2015). Explosive Internet of Things spending to reach $1.7 trillion in 2020 [Press release]. Retrieved from http://www.idc.com/getdoc.jsp?containerId=prUS25658015
Leavitt, N. (2009). Complex-Event Processing poised for growth. Computer, 42(4), 17-20. doi:10.1109/MC.2009.109
Lin, Y.-F., Wu, C.-W., Huang, C.-F., & Tseng, V. S. (2015). Discovering utility-based episode rules in complex event sequences. Expert Systems with Applications, 42(12), 5303-5314. doi:http://dx.doi.org/10.1016/j.eswa.2015.02.022
Luckham, D. (2002). The power of events. Reading, MA: Addison-Wesley.
Luckham, D. C., & Frasca, B. (1998). Complex event processing in distributed systems. Computer Systems Laboratory Technical Report CSL-TR-98-754. Stanford University, Stanford, 28.
Malamud, C. (1992). Exploring the Internet: a technical travelogue: Prentice Hall.
MarketAndMarkets. (2016). Complex Event Processing (CEP) market worth 4.95 billion usd by 2020.   Retrieved from http://www.marketsandmarkets.com/PressReleases/complex-event-processing-cep.asp
Mehdiyev, N., Krumeich, J., Enke, D., Werth, D., & Loos, P. (2015). Determination of rule patterns in Complex Event Processing using machine learning techniques. Procedia Computer Science, 61, 395-401. doi:http://dx.doi.org/10.1016/j.procs.2015.09.168
Microsoft Windows 3.0. (n.d.).   Retrieved from http://www.oldcomputermuseum.com/os/windows_3.0.html
Oracle. (2016). Oracle Complex Event Processing.  Retrieved from http://www.oracle.com/technetwork/middleware/complex-event-processing/overview/index.html
Saffo, P. (1997). Sensors: The next wave of innovation. Communications of the ACM, 40(2), 92-98.
Singh, S., & Singh, N. (2015, 8-10 Oct. 2015). Internet of Things (IoT): Security challenges, business opportunities & reference architecture for E-commerce. Paper presented at the Green Computing and Internet of Things (ICGCIoT), 2015 International Conference on.
StreamBase. (2016). StreamBase.   Retrieved from http://www.streambase.com/
Tiburski, R. T., Albernaz Amaral, L., De Matos, E., & Hessel, F. (2015). The importance of a standard security architecture for SOA-based IoT middleware. Communications Magazine, IEEE, 53(12), 20-26. doi:10.1109/MCOM.2015.7355580
Weber, S., Lowe, H. J., Malunjkar, S., & Quinn, J. (2010). Implementing a real-time complex event stream processing system to help identify potential participants in clinical and translational research studies. AMIA Annual Symposium Proceedings, 2010, 472-476.  Retrieved from http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3041381/
Weiser, M. (1991). The computer for the 21st century. Scientific american, 265(3), 94-104.
WSO2. (2016). Complex Event Processor.   Retrieved from http://wso2.com/products/complex-event-processor/