Some 30 years ago the hype was MAP (Manufacturing Automation Protocol). One of the next hypes is "Fog Computing".
"The Manufacturing Automation Protocols (MAP) and Technical Office Protocols (TOP) were the first commercially defined and accepted functional profiles. Both arose because of the operational concerns of two large corporations, General Motors and Boeing. lt is generally accepted that MAP and TOP were the forerunners, first in adopting OSI standards and then in developing usable profiles.
It all started at the end of the 1970s. GM had on its manufacturing plant shop floors some 20 000 programmable controllers, 2000 robots, and more than 40 000 intelligent devices, all in support of its business. The main problem was that less than one-eighth of the equipment could communicate beyond the limits of its own island of automation; the main inhibiting factor to greater integration being the lack of an appropriate communications infrastructure. As devices supplied were mostly vendor-specific, to do a particular job, they were not designed or optimised to intercommunicate or support each other's functions.
GM finally realised the gravity of their situation when they began to evaluate the cost of automation, attributing half the cost to the need for devices to intercommunicate. To resolve the matter a task force was created comprising representatives from GM's divisions and their suppliers, with the objective of developing an independent computer network protocol capable of supporting a true multi-vendor environment on the shop floor. They used the OSI model and standards as a basis for interconnection and development of further enhancements. " (Source: The Essential OSI, NSW Technical and Further Education Commission 1991)
The first MAP profile was published in 1982, Version 1.0 in 1984, and MAP 3.0 in 1988. A long time ago!
The MAP approach was understood by just a few experts. Most people believed that MAP was too complex, too ... Fieldbuses were thought of as solutions that could cover a kind of Mini-MAP and realtime communication. MAP passed away, and hundreds of fieldbuses have been developed since the late 80s. The result was that myriads of automation islands hit the factory floor. These islands were bridged with OPC and so on ... Now it is 2016! Is there anything new?
Not that much. We still have the problem that a seemingly unlimited number of (usually raw) signals (measurements, status, settings, ...) are polled or pushed from the sensor and actuator level all the way up to the SCADA level or even higher. This approach to signal acquisition does not scale in a future where we expect thousands of times more devices, sensors, controllers, ... than GM had to manage in the 70s. Does Cloud Computing solve this challenge? It is unlikely that this (more or less raw) data acquisition will work.
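To make the scaling problem concrete, here is a minimal back-of-envelope sketch in Python. All numbers in it are illustrative assumptions (device count, signals per device, poll rate), not figures from GM or any real plant:

    # Back-of-envelope estimate: message volume when raw signals are
    # polled centrally. All numbers are illustrative assumptions.
    devices = 1_000_000        # assumed number of field devices
    signals_per_device = 10    # assumed raw signals per device
    poll_interval_s = 1.0      # assumed SCADA polling cycle in seconds

    messages_per_second = devices * signals_per_device / poll_interval_s
    print(f"Central polling: {messages_per_second:,.0f} messages/s")
    # -> 10,000,000 messages/s have to cross the network and be processed
    #    centrally, even when most of the values have not changed at all.

Even with these modest assumptions, the central system must absorb millions of messages per second, most of which carry values that have not changed.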
And now? What to do? Use Fog Computing!
"
Fog computing is the missing link to accelerate IoT. It spans the continuum from Cloud to Things in order to bring compute, control, storage and networking
closer to where the data is being generated.
The sheer breadth and scale of IoT solutions requires collaboration at a number of levels, including hardware, software across edge and cloud as well as the protocols and standards that enable all of our “things” to communicate. Existing infrastructures simply
can’t keep up with the data volume and velocity created by IoT devices, nor meet the low latency response times required in certain use cases, such as emergency services and autonomous vehicles. The strain on networks from cloud-only or cloud-mostly models
will only get worse as IoT applications and devices continue to proliferate. In addition, the devices themselves are starting to become smarter, allowing for additional control and capabilities closer to where the data is being generated." (
http://www.openfogconsortium.org/about-us/#frequently-asked-questions)
It is quite interesting that the Cloud Computing hype is seen from a different perspective in 2016.
The approach of IEC 61850 (started in 1998) has from the very beginning been the same as the one discussed in the Fog Computing community: bring compute, control, storage, and networking closer to where the data is being generated (at THE process level, e.g. in substations or power generation plants everywhere). Many information models standardized in IEC 61850 and IEC 61400-25 define distributed functions like protection, active power control, or reactive power compensation ... schedules for tariffs, alarming, tripping, reporting by exception (RBE), ... in order to reduce the needed bandwidth and allow for realtime and near-realtime behavior.
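As an illustration of the RBE idea, here is a minimal Python sketch of a deadband publisher. It shows only the underlying principle; it is not the IEC 61850 reporting services or any standard API, and all class and parameter names are made up:

    # Illustrative sketch of report-by-exception (RBE) with a deadband.
    # This shows only the principle behind IEC 61850 reporting; it is
    # NOT the standard's services or API, and all names are hypothetical.

    class RbePublisher:
        """Publish a value only when it leaves a deadband around the
        last reported value, instead of on every acquisition cycle."""

        def __init__(self, deadband, report):
            self.deadband = deadband    # minimum change worth reporting
            self.report = report        # callback towards SCADA/cloud
            self.last_reported = None

        def update(self, value):
            if (self.last_reported is None
                    or abs(value - self.last_reported) >= self.deadband):
                self.report(value)
                self.last_reported = value
            # otherwise: marginal change -> no network traffic at all

    # Usage: only 3 of the 6 samples cause traffic towards the SCADA level.
    pub = RbePublisher(deadband=0.5, report=lambda v: print("reported", v))
    for sample in [10.0, 10.1, 10.2, 10.7, 10.8, 9.9]:
        pub.update(sample)

Instead of shipping every raw sample upwards, the device itself decides which changes are worth reporting; that is exactly the "compute closer to where the data is generated" idea.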
Lesson learned: Fog Computing is already practiced in the domain of power automation - and it is based on well-defined standards (IEC 61850 and IEC 61400-25)! Both standard series make use of the most crucial standard of MAP:
MMS (Manufacturing Message Specification, ISO 9506). It took some 30 years for more people to understand the challenges! ;-) There is nothing new under the sun.