Collecting Data Directly with a PC

The inclusion of the Data Acquisition (local) component within a Visual VIGO design enables a databuffer of collected data to be generated on the hard disc of the local PC. Such a databuffer normally consists of records having fields for an identification number (IDNo), a date and time stamp (DateTime), a Value, and a field indicating the integrity of the data (ErrorCode).
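To make the record layout concrete, the following is a minimal sketch of one databuffer record as a Python dataclass. The four field names come from the description above; the field types and the ErrorCode convention are assumptions for illustration, not the PDDE's actual on-disc format.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class DatabufferRecord:
    """One databuffer record with the four fields described above.
    Types and the ErrorCode convention are assumed for illustration."""
    IDNo: int            # identification number of the variable
    DateTime: datetime   # date and time stamp of the sample
    Value: float         # the sampled value
    ErrorCode: int       # integrity of the data (0 = OK, assumed)
```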
Databuffer Engine

The databuffer is generated and managed by the PROCES-DATA Databuffer Engine (PDDE), which is invoked whenever a Data Acquisition component is included in a running Visual VIGO design. When running, the PDDE icon is shown in the taskbar tray. The databuffer generated by the PDDE is structured so that it cannot be edited, in order to protect the data from fraudulent alteration; however, since the PDDE is an OLE server, the data can be used by other OLE client applications (e.g. Excel, Access, SQL). By default, each databuffer is stored locally within the VIGO folder created where the VIGO suite was initially installed, the path to the databuffer folder being “VIGO\Dynamic Data\Visual VIGO\[Project Name]\[Variable PhysID]”.
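For reference, the sketch below assembles that default path from its parts. The folder layout is taken from the description above; the installation root, project name and PhysID are hypothetical placeholders.

```python
import os

# Hypothetical placeholders; real values come from the VIGO installation
# folder, the Visual VIGO project name and the variable's PhysID.
vigo_root = r"C:\VIGO"
project_name = "MyProject"
variable_physid = "Plant1.Boiler.Temp"

# VIGO\Dynamic Data\Visual VIGO\[Project Name]\[Variable PhysID]
databuffer_folder = os.path.join(
    vigo_root, "Dynamic Data", "Visual VIGO", project_name, variable_physid
)
print(databuffer_folder)
# C:\VIGO\Dynamic Data\Visual VIGO\MyProject\Plant1.Boiler.Temp
```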
Databuffer Files

Within each databuffer folder, additional, consecutively numbered files are generated whenever the current file exceeds 52 KB of data; the “Modified” attribute of each file indicates the date and time of its last update. Such a structure ensures high integrity of stored data and maximum speed of data searching and curve plotting. The structure is formed automatically when the user chooses the PhysID of the variable while configuring the Variable Link property of the Data Acquisition component. Switching the Automatic Update property ‘On’ and choosing an Update Interval causes data to be regularly requested from the external variable defined in the PhysID field and stored in the databuffer. The Data Expired Interval (days) property ensures that the amount of data stored on the PC does not become excessive or unwanted: the DateTime attribute of historical files is tested periodically, and files that exceed the expiry period are deleted. Other DAQ components can then use this stored data for plotting curves and/or displaying data in tabular form.
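To make the file-rotation and expiry behaviour concrete, the following is a minimal sketch of the same two mechanisms, not the PDDE's actual implementation. The 52 KB rotation threshold and the expiry test on file modification times are taken from the description above; the file-naming scheme is assumed.

```python
import os
import time

MAX_FILE_SIZE = 52 * 1024  # rotation threshold described above (52 KB)

def current_file(folder: str) -> str:
    """Return the file to write to, moving on to the next consecutively
    numbered file once the current one exceeds 52 KB (naming assumed)."""
    n = 1
    while True:
        path = os.path.join(folder, f"{n:06d}.dat")  # hypothetical name scheme
        if not os.path.exists(path) or os.path.getsize(path) < MAX_FILE_SIZE:
            return path
        n += 1

def expire_old_files(folder: str, expiry_days: float) -> None:
    """Delete historical files whose 'Modified' time exceeds the
    Data Expired Interval, mirroring the behaviour described above."""
    cutoff = time.time() - expiry_days * 86400
    for name in os.listdir(folder):
        path = os.path.join(folder, name)
        if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            os.remove(path)
```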
Sampling and Communication

In DAQ terminology, the frequency at which requests are made to read and store the value of a variable is called the Sampling Time, and each captured measurement is called a Sample Value. With the Data Acquisition (local) component, the Sampling Time is synonymous with the Update Interval property. Since the majority of variables are analogue representations of a real-time measurement, it must be considered how often a sample needs to be taken, to ensure that the digitally stored values provide a fair representation of the real signal over a period of time.
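As an illustration of the sampling concept (not VIGO code), the sketch below takes one Sample Value per Update Interval and stores it as a record with the fields described earlier. The read function is a stand-in that returns a simulated reading; in a real system this would be a network request to the variable addressed by the PhysID.

```python
import random
import time
from datetime import datetime

def read_variable() -> float:
    """Stand-in for reading the external variable addressed by the
    PhysID; here it just returns a simulated temperature."""
    return 20.0 + random.random()

def sample_loop(update_interval_s: float, samples: list) -> None:
    """Take one Sample Value per Update Interval (the Sampling Time)
    and append it as a record with the fields described earlier."""
    while True:
        samples.append({
            "IDNo": 1,                   # hypothetical variable ID
            "DateTime": datetime.now(),  # time stamp of the sample
            "Value": read_variable(),    # the Sample Value
            "ErrorCode": 0,              # 0 = data OK (assumed)
        })
        time.sleep(update_interval_s)
```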
Obviously, the higher the sampling frequency, the more accurate the digital representation will be. However, the consequence is a need for more memory storage, leading to slower search times and a heavier load on the communication network, ultimately causing delays in other parts of the system. On the other hand, it should be questioned whether it is really necessary to take samples so often, especially when the measurement may be changing very slowly, if at all, for long periods of time. It is up to the system designer to set the sampling time (Update Interval) as a compromise between taking samples less frequently to reduce the load, and knowing the value of the variable more often. For example, consider an ambient temperature measurement, which may change its value very slowly. It would be quite acceptable to take a sample every minute or longer and still get a true representation of the trend of the temperature change over a period of time. On the other hand, if it is important to log a variable whose value changes significantly over a short time, the sampling time needs to be shorter.

There are additional facilities that can be used to ensure that only relevant sampling and logging of data is undertaken, and that the trend of a measured signal is plotted as efficiently as possible. This, however, requires more processing power than can reasonably be expected of the local PC, especially if there are many variables to log. Such facilities are provided by using the Remote Data Acquisition component in conjunction with the power provided by DataCollect Channels in a remotely located data-logging DPI.
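To put the storage side of this trade-off into rough numbers, the sketch below estimates daily storage for one logged variable at a few Update Intervals. The 52 KB file size comes from the text above; the per-record size is an assumption, since the PDDE's actual on-disc format is not documented here.

```python
RECORD_SIZE_BYTES = 16       # assumed size of IDNo + DateTime + Value + ErrorCode
FILE_SIZE_BYTES = 52 * 1024  # rotation threshold from the text (52 KB)

def daily_storage(update_interval_s: float) -> tuple[float, float]:
    """Return (bytes per day, databuffer files per day) for one variable
    sampled at the given Update Interval."""
    samples_per_day = 86400 / update_interval_s
    bytes_per_day = samples_per_day * RECORD_SIZE_BYTES
    return bytes_per_day, bytes_per_day / FILE_SIZE_BYTES

for interval in (1, 60, 600):  # 1 s, 1 min, 10 min
    b, f = daily_storage(interval)
    print(f"interval {interval:>4} s: {b / 1024:8.1f} KB/day, {f:6.2f} files/day")
```

Under these assumptions, one-second sampling produces roughly 1.3 MB and some 26 files per day per variable, while one-minute sampling produces only about 22 KB per day, which illustrates why the Update Interval should be no shorter than the signal's rate of change requires.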