{"id":148,"date":"2025-10-08T10:08:30","date_gmt":"2025-10-08T08:08:30","guid":{"rendered":"https:\/\/intelliprops.fhwn.ac.at\/?page_id=148"},"modified":"2025-10-08T11:05:08","modified_gmt":"2025-10-08T09:05:08","slug":"results-ap5","status":"publish","type":"page","link":"https:\/\/intelliprops.fhwn.ac.at\/?page_id=148","title":{"rendered":"Results WP5"},"content":{"rendered":"\n<h2 class=\"wp-block-heading\">Introduction<\/h2>\n\n\n\n<p>Demonstrators allow technologies, principles, concepts, or ideas to be made tangible and showcased in practice. Abstract explanations thus become accessible to a target audience, and the feasibility of the demonstrated concept can be proven. For companies, the major advantage is that application examples can be observed live, allowing them to better assess applicability for their own operations. Another important advantage is that developing demonstrators yields significant know-how gains, since key experience relevant to later implementation in a production environment is gathered early on. Demonstrators also aim to bring the demonstrated environment as close as possible to a realistic scenario, which further increases their relevance. For this purpose, industrial or industry-related software systems (e.g., ERP systems, MES systems) are used.<\/p>\n\n\n\n<p>The tasks in work package 5 include the technical planning of the demonstrators, the procurement of necessary components, the setup of demonstrators for both make-to-order and series production, and the commissioning and testing of the demonstrators. For this purpose, two systems are used. A robotic cell (Figures 1 and 2) with a six-axis single-arm robot of the type ABB IRB 1100-4\/0.58 was put into operation to represent single-item production.
Here, in an autonomous cell, a robotic gripper (Figures 3 and 4) is produced.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"2560\" height=\"1920\" src=\"https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-1-3-scaled.jpg\" alt=\"\" class=\"wp-image-194\" style=\"width:840px;height:auto\" srcset=\"https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-1-3-scaled.jpg 2560w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-1-3-300x225.jpg 300w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-1-3-1024x768.jpg 1024w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-1-3-768x576.jpg 768w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-1-3-1536x1152.jpg 1536w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-1-3-2048x1536.jpg 2048w\" sizes=\"auto, (max-width: 2560px) 100vw, 2560px\" \/><figcaption class=\"wp-element-caption\">Figure 1: Robotic Cell<\/figcaption><\/figure>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"2560\" height=\"1920\" src=\"https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-2-1-scaled.jpg\" alt=\"\" class=\"wp-image-195\" style=\"width:840px;height:auto\" srcset=\"https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-2-1-scaled.jpg 2560w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-2-1-300x225.jpg 300w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-2-1-1024x768.jpg 1024w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-2-1-768x576.jpg 768w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-2-1-1536x1152.jpg 1536w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-2-1-2048x1536.jpg 2048w\" 
sizes=\"auto, (max-width: 2560px) 100vw, 2560px\" \/><figcaption class=\"wp-element-caption\">Figure 2: Robotic Cell<\/figcaption><\/figure>\n\n\n\n<figure class=\"wp-block-image size-large is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"768\" height=\"1024\" src=\"https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-3-768x1024.jpg\" alt=\"\" class=\"wp-image-179\" style=\"width:840px;height:auto\" srcset=\"https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-3-768x1024.jpg 768w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-3-225x300.jpg 225w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-3-1152x1536.jpg 1152w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-3-1536x2048.jpg 1536w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-3-scaled.jpg 1920w\" sizes=\"auto, (max-width: 768px) 100vw, 768px\" \/><figcaption class=\"wp-element-caption\">Figure 3: Overview of components<\/figcaption><\/figure>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"1920\" height=\"2560\" src=\"https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-4-scaled.jpg\" alt=\"\" class=\"wp-image-183\" style=\"width:840px;height:auto\" srcset=\"https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-4-scaled.jpg 1920w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-4-225x300.jpg 225w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-4-768x1024.jpg 768w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-4-1152x1536.jpg 1152w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-4-1536x2048.jpg 1536w\" sizes=\"auto, (max-width: 1920px) 100vw, 1920px\" \/><figcaption class=\"wp-element-caption\">Figure 4: Robotic gripper<\/figcaption><\/figure>\n\n\n\n<p>The 
second system is a Festo FMS50 didactic system (Figure 5). The system is modern and uses industrial components, including Siemens S7 controllers. This system can produce pneumatic cylinders in a wide variety of versions (see Figure 6). For both systems, an open-source ERP system (Odoo) was implemented to plan and control production.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"2560\" height=\"1920\" src=\"https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-5-1-scaled.jpg\" alt=\"\" class=\"wp-image-198\" style=\"width:840px;height:auto\" srcset=\"https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-5-1-scaled.jpg 2560w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-5-1-300x225.jpg 300w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-5-1-1024x768.jpg 1024w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-5-1-768x576.jpg 768w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-5-1-1536x1152.jpg 1536w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-5-1-2048x1536.jpg 2048w\" sizes=\"auto, (max-width: 2560px) 100vw, 2560px\" \/><figcaption class=\"wp-element-caption\">Figure 5: FESTO FMS50 System<\/figcaption><\/figure>\n\n\n\n<p><\/p>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"605\" height=\"254\" src=\"https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-6.png\" alt=\"\" class=\"wp-image-186\" style=\"width:840px;height:auto\" srcset=\"https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-6.png 605w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-6-300x126.png 300w\" sizes=\"auto, (max-width: 605px) 100vw, 605px\" \/><figcaption class=\"wp-element-caption\">Figure 6: Pneumatic 
cylinder<\/figcaption><\/figure>\n\n\n\n<h2 class=\"wp-block-heading\">Overview Demonstrators<\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">D1D Primary Requirements Planning<\/h3>\n\n\n\n<h4 class=\"wp-block-heading\">Initial situation<\/h4>\n\n\n\n<p>Primary requirements planning is the basis for all planning. Depending on the estimated demand, material is procured, or resources (human\/machine) are reserved. Classical forecasting models are often difficult to calculate or rarely used, as their application is not very intuitive for users. Within the IntelliProPS project, our goal was therefore to make the application of AI-supported quantitative forecasting as simple and intuitive as possible.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">Objective<\/h4>\n\n\n\n<p>The demonstrator aims to show planning departments of companies with variant-rich series production how AI methods can be used to forecast demand quantities and thus production quantities (finished products) for upcoming planning periods. Planning horizons vary between companies and are strongly dependent on the respective product, related lead times, and quantities. The demonstrator simulates the typical planning process in variant-rich series production. For this purpose, a history of sales figures is simulated, forming the data basis. Building on this, a likely demand curve for the next planning period \u2013 in this case quarterly \u2013 is predicted, and production quantities for sub-periods are estimated.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">Technical Implementation<\/h4>\n\n\n\n<p>The demonstrator was set up within the IT infrastructure of the InnovationLab at the University of Applied Sciences Wiener Neustadt. The parts considered are robotic grippers, assembled by an ABB robot. Resource planning is carried out using the open-source ERP system Odoo, version 16.
The Odoo instance runs locally on a server, allowing extensive customizations as required by this project.<\/p>\n\n\n\n<p>To generate synthetic data, the D15S simulator was used, which enables the simulation of different demand patterns. For selected products, higher demand was simulated, while the majority of variants followed stochastic patterns. In total, over 80,000 data records were generated.<\/p>\n\n\n\n<p>As a forecasting model, LightGBM (<a href=\"https:\/\/lightgbm.readthedocs.io\/en\/stable\/\">https:\/\/lightgbm.readthedocs.io\/en\/stable\/<\/a>) was chosen because it enables fast training of many models and handles large datasets very well. The developed AI model was seamlessly integrated into the existing Odoo ERP system. The logic and model training were implemented directly in Python, using Odoo\u2019s existing folder structure and classes. The ML model accesses the PostgreSQL database (which stores system data including transaction and master data) via the psycopg2 library. Tests showed that this approach was significantly faster than using Odoo\u2019s API. The data is preprocessed, a corresponding model is trained, and the visualization is directly integrated into the XML code that handles Odoo\u2019s UI rendering.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">Results<\/h4>\n\n\n\n<p>The material master in Odoo was extended with two functions (Figure 7). The first function allows training an AI model directly for a given material. Training was optimized so that the process takes only a few seconds. Users can thus easily train or retrain specific AI models without additional expertise. With the second implemented function, the trained model can then be used to forecast demand for the coming weeks. Current data is pulled directly from the ERP system, preprocessed automatically without manual intervention. Validation showed that, given sufficient data, the system can forecast demand for one year with an error of less than 0.6%. 
The demonstrator successfully proves that seamless integration of AI models into ERP systems is possible without major difficulties, provided the ERP system offers the necessary structures. Depending on the amount of training data, the AI models achieved very good RMSE values.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"716\" height=\"383\" src=\"https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-7.jpg\" alt=\"\" class=\"wp-image-187\" style=\"width:840px;height:auto\" srcset=\"https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-7.jpg 716w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-7-300x160.jpg 300w\" sizes=\"auto, (max-width: 716px) 100vw, 716px\" \/><figcaption class=\"wp-element-caption\">Figure 7: Screenshot Odoo<\/figcaption><\/figure>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"1361\" height=\"476\" src=\"https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-8.png\" alt=\"\" class=\"wp-image-188\" style=\"width:840px;height:auto\" srcset=\"https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-8.png 1361w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-8-300x105.png 300w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-8-1024x358.png 1024w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-8-768x269.png 768w\" sizes=\"auto, (max-width: 1361px) 100vw, 1361px\" \/><figcaption class=\"wp-element-caption\">Figure 8: Forecast<\/figcaption><\/figure>\n\n\n\n<h3 class=\"wp-block-heading\">D9D Detection and Classification of Components<\/h3>\n\n\n\n<h4 class=\"wp-block-heading\">Initial situation<\/h4>\n\n\n\n<p>In production environments, incorrect identification of components can potentially have serious consequences. 
Resulting incorrect bookings distort stock levels in the ERP system, which may lead to overstock or stockouts. Component recognition using computer vision is an important tool in industrial manufacturing that automatically and precisely identifies and analyzes components or parts in a production process. This technology uses image processing algorithms to extract visual information from images or videos and identify specific components. Normally, component recognition relies on image processing techniques such as thresholding, masking, etc. However, conventional methods are often unsuitable for industrial environments. Changes in lighting conditions (brightness variations or shadowing) or variations in components can impair recognition accuracy. Movements of people or objects in the detection area can also cause distortions or errors. These environmental factors challenge the reliability and consistency of component recognition.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">Objective<\/h4>\n\n\n\n<p>To overcome these challenges, an AI-based object detection model was developed that can identify the components of a robotic gripper. The introduced robotic gripper serves as a complex, variable object consisting of three main components: a base, a coupling, and fingers. The base can be either rounded or square, the coupling is available in three different colors, and the number of fingers can be either two or three, with the fingers themselves available in three different lengths (40, 60, or 80 mm).<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">Technical Implementation<\/h4>\n\n\n\n<p>Convolutional Neural Networks (CNNs) require a large amount of training data. The training data was collected as part of a bachelor thesis and annotated using the software makesense.ai. Photos were taken with a Baumer camera, which was also permanently mounted on the system. 
For detection and classification, YOLOv8 from Ultralytics was used, the latest model in the series at the start of the project. The trained model was saved as a .pt file. YOLO was chosen because its low computational requirements allow it to run on less powerful PCs. Normally, the model would continuously output the detected classes per image, analyzing an image every 20\u201350 ms on high-performance PCs. In a production environment, however, this continuous stream of detections is undesirable. To address this, recognition is triggered by light barriers. As soon as a part triggers the light barrier, the conveyor belt is stopped by the controller. The D9D model is then started, and the detected class is returned exactly once. If it is a known class, the conveyor restarts until the next component is detected. The recognized class can be transferred to the ERP system to dynamically assign the component to a storage location.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">Results<\/h4>\n\n\n\n<p>Within this demonstrator, a machine vision model (YOLOv8) was trained for component recognition and directly linked with the ERP system (Odoo). Recognized material can now be automatically booked into and out of storage. This avoids issues associated with conventional automated booking systems such as barcodes (susceptible to dirt, require part marking) or RFID (costly implementation, misidentification with many components). The experiments also showed that the implementation method is just as important for system performance as the AI model itself.
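<\/p>\n\n\n\n<p>The single-shot recognition flow described in the implementation section can be sketched as follows; the PLC, camera, and model interfaces are stubbed out, and the class names are illustrative rather than the project\u2019s actual labels.<\/p>\n\n\n\n

```python
# Sketch of the triggered, single-shot recognition flow: the light barrier
# stops the belt, one frame is classified exactly once, and only known
# classes restart the conveyor. All interfaces are stand-ins.
KNOWN_CLASSES = {'base_round', 'base_square', 'coupling', 'finger'}

def on_light_barrier(classify, capture_frame, plc):
    plc.stop_belt()                    # controller halts the conveyor
    label = classify(capture_frame())  # exactly one inference per part
    if label in KNOWN_CLASSES:
        plc.resume_belt()              # known part: belt restarts
        return label                   # reported once, e.g. to the ERP system
    return None                        # unknown part: belt stays stopped

class StubPLC:
    def __init__(self):
        self.belt_running = True
    def stop_belt(self):
        self.belt_running = False
    def resume_belt(self):
        self.belt_running = True

plc = StubPLC()
result = on_light_barrier(lambda frame: 'coupling', lambda: 'frame', plc)
```

<p>In the demonstrator, the classify callable would wrap the YOLOv8 model loaded from the .pt file, and the PLC calls would address the cell controller.<\/p>\n\n\n\n<p>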
In a test series, a booking accuracy of 100% was achieved.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"1008\" height=\"765\" src=\"https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-9.jpg\" alt=\"\" class=\"wp-image-189\" style=\"width:840px;height:auto\" srcset=\"https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-9.jpg 1008w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-9-300x228.jpg 300w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-9-768x583.jpg 768w\" sizes=\"auto, (max-width: 1008px) 100vw, 1008px\" \/><figcaption class=\"wp-element-caption\">Figure 9: Detection and Classification of Components<\/figcaption><\/figure>\n\n\n\n<h3 class=\"wp-block-heading\">D11D Live-Inventory<\/h3>\n\n\n\n<h4 class=\"wp-block-heading\">Initial situation<\/h4>\n\n\n\n<p>Most Austrian manufacturing companies are required to carry out an annual inventory. Conducting an inventory is not only labor-intensive and time-consuming, but manual inventories are also prone to error. Deviations of around 2% are not uncommon. Especially in low-volume and low-variance production environments, real-time data is essential to avoid bottlenecks and ensure smooth production. Although traditional systems such as Enterprise Resource Planning (ERP) support inventory tracking, they still require regular inventories and manual recording of goods movements. More advanced technologies such as Warehouse Management Systems (WMS), enhanced picking aids, and RFID-based systems offer improvements but are often associated with high investment costs, making them unaffordable for many SMEs.
While manual inventories remain indispensable in complex warehouses (high-bay storage, cabinets, etc.), AI methods can be used in simpler storage environments (e.g., workplace storage).<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">Objective<\/h4>\n\n\n\n<p>The goal was to develop a cost-effective and efficient inventory management system tailored specifically to the needs of small and medium-sized manufacturing companies. This system should provide real-time inventory data while minimizing the use of expensive technologies. By leveraging AI algorithms and low-cost sensors, accurate inventory information can be provided to avoid bottlenecks and ensure uninterrupted production.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">Technical Implementation<\/h4>\n\n\n\n<p>A comparison of different approaches to recording inventory levels (e.g., GPS systems, RFID, barcodes) showed that machine vision models offer the most promising solution. Hardware is inexpensive to acquire, and the software is mostly open source, so no high investment costs are incurred. Convolutional Neural Networks (CNNs) in particular promise high accuracy in component recognition. For detection and classification, YOLO version 11 from Ultralytics was used, the latest model in the series at the time of model development. The trained model was saved as a .pt file. YOLO was chosen due to its low computational requirements, making it suitable for implementation on less powerful PCs. To create the training dataset necessary for the AI model, a comprehensive set of images was collected as part of a bachelor thesis and annotated using the Roboflow software solution. A Baumer camera (Figures 10 and 11) was installed to continuously capture images of the storage area. Continuous monitoring enables actual stock levels to be recorded. The recognized classes (materials) and their quantities can be directly transferred to an ERP system, in this case Odoo.
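<\/p>\n\n\n\n<p>The comparison between camera-derived and ERP stock reduces to counting detected classes and diffing against target quantities; the following sketch uses illustrative material names, not the project\u2019s actual master data.<\/p>\n\n\n\n

```python
# Sketch: aggregate per-class detections from the vision model and compare
# them with target stock from the ERP system. Positive differences would be
# booked in, negative ones booked out. Material names are examples.
from collections import Counter

def correction_bookings(detections, target_stock):
    actual = Counter(detections)  # counted classes from the camera image
    corrections = {}
    for material in set(actual) | set(target_stock):
        diff = actual[material] - target_stock.get(material, 0)
        if diff != 0:
            corrections[material] = diff
    return corrections

corrections = correction_bookings(
    ['base', 'base', 'coupling'],
    {'base': 2, 'coupling': 2, 'finger': 1})
```

<p>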
If a discrepancy between actual and target stock is detected, a correction booking can be triggered automatically.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">Results<\/h4>\n\n\n\n<p>The demonstrator successfully proved the combination of machine vision and ERP systems for monitoring stock levels. However, some shortcomings were identified regarding the AI model\u2019s performance. While some components could be recognized with very high accuracy and precision (up to 100%), recognition accuracy decreased significantly for similar variants of a component (especially the \u201cBase\u201d component). This indicates an insufficient training dataset, which will need to be addressed in the future.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"1344\" height=\"1344\" src=\"https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-10.jpg\" alt=\"\" class=\"wp-image-190\" style=\"width:840px;height:auto\" srcset=\"https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-10.jpg 1344w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-10-300x300.jpg 300w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-10-1024x1024.jpg 1024w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-10-150x150.jpg 150w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-10-768x768.jpg 768w\" sizes=\"auto, (max-width: 1344px) 100vw, 1344px\" \/><figcaption class=\"wp-element-caption\">Figure 10: Live Inventory Robotic Cell<\/figcaption><\/figure>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"884\" height=\"667\" src=\"https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-11.jpg\" alt=\"\" class=\"wp-image-191\" style=\"width:840px;height:auto\" srcset=\"https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-11.jpg 
884w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-11-300x226.jpg 300w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-11-768x579.jpg 768w\" sizes=\"auto, (max-width: 884px) 100vw, 884px\" \/><figcaption class=\"wp-element-caption\">Figure 11: Live Inventory FESTO FMS50<\/figcaption><\/figure>\n\n\n\n<h3 class=\"wp-block-heading\">D12D Progress Monitoring<\/h3>\n\n\n\n<h4 class=\"wp-block-heading\">Initial situation<\/h4>\n\n\n\n<p>In the first phase of the IntelliProPS project, a series of expert interviews was conducted with Austrian SMEs to identify relevant problem areas. One of the most significant problems that emerged was the lack of transparency in production, especially concerning the progress of production orders. In variant-rich single-item production, where products are manufactured individually, to customer specification, and in small quantities, it is important to precisely track the status and position of a production order. This is usually done manually or semi-automatically. Manual progress monitoring is prone to human error, as it relies on manual entries. In addition, there are often delays in updating information, which can negatively impact responsiveness to changes in the production process.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">Objective<\/h4>\n\n\n\n<p>The demonstrator aims to illustrate the effectiveness of progress monitoring in single-item production through the integration of camera systems with artificial intelligence (AI), programmable logic controllers (PLC), and Enterprise Resource Planning (ERP) systems. When a production order is completed, a signal can be sent to the PLC in order to initiate the next order, ensuring a seamless transition between production steps. At the same time, a signal can be sent to the ERP system to update the current production status and synchronize the corresponding databases for further planning.
The goal is a transparent representation of production orders, where the status of orders can be monitored in near real-time in the ERP system.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">Technical Implementation<\/h4>\n\n\n\n<p>For detecting the actual status of a production order or workpiece, a machine vision model was used. The advantage of machine vision models is that they can not only determine the progress of an assembly but also detect defects at the same time, meaning that progress monitoring and final inspection can be combined into one model. Here too, a YOLO model (You Only Look Once) was chosen due to the ability to implement such models on less powerful PCs without significant performance losses. As soon as a component is detected, the assembly progress is captured. The class recognized by the YOLO model defines the status of the production order and its quality indicator (OK or NOK).<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">Results<\/h4>\n\n\n\n<p>The demonstrator showed that production orders can be monitored, and progress directly reported back to the ERP system to support planners in their work and ensure they always operate with up-to-date information. 
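<\/p>\n\n\n\n<p>The step from a recognized class to an order status with quality indicator can be sketched as follows; the class naming scheme is an assumption for illustration, not the project\u2019s actual label set.<\/p>\n\n\n\n

```python
# Sketch: derive assembly step and quality indicator (OK or NOK) from the
# class name returned by the vision model. The naming scheme is assumed,
# not taken from the project.
def order_status(detected_class):
    # e.g. 'base_mounted_ok' -> step 'base_mounted', quality 'OK'
    step, _, quality = detected_class.rpartition('_')
    if quality not in ('ok', 'nok'):
        raise ValueError('unexpected class: %s' % detected_class)
    return {'step': step, 'quality': quality.upper()}

status = order_status('base_mounted_ok')
```

<p>Both values could then be reported back to the production order in the ERP system.<\/p>\n\n\n\n<p>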
Out of 18 defined classes, 10 were recognized with 100% precision, while the remaining classes achieved 98% precision.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"926\" height=\"721\" src=\"https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-12.jpg\" alt=\"\" class=\"wp-image-192\" style=\"width:840px;height:auto\" srcset=\"https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-12.jpg 926w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-12-300x234.jpg 300w, https:\/\/intelliprops.fhwn.ac.at\/wp-content\/uploads\/2025\/10\/Figure-12-768x598.jpg 768w\" sizes=\"auto, (max-width: 926px) 100vw, 926px\" \/><figcaption class=\"wp-element-caption\">Figure 12: Progress Monitoring<\/figcaption><\/figure>\n","protected":false},"excerpt":{"rendered":"<p>Demonstrators allow technologies, principles, concepts, or ideas to be made tangible and showcased in practice.
Explanations thus become accessible to a target audience, and the feasibility of the object &#8230;<\/p>\n","protected":false},"author":5,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-148","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/intelliprops.fhwn.ac.at\/index.php?rest_route=\/wp\/v2\/pages\/148","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/intelliprops.fhwn.ac.at\/index.php?rest_route=\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/intelliprops.fhwn.ac.at\/index.php?rest_route=\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/intelliprops.fhwn.ac.at\/index.php?rest_route=\/wp\/v2\/users\/5"}],"replies":[{"embeddable":true,"href":"https:\/\/intelliprops.fhwn.ac.at\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=148"}],"version-history":[{"count":5,"href":"https:\/\/intelliprops.fhwn.ac.at\/index.php?rest_route=\/wp\/v2\/pages\/148\/revisions"}],"predecessor-version":[{"id":199,"href":"https:\/\/intelliprops.fhwn.ac.at\/index.php?rest_route=\/wp\/v2\/pages\/148\/revisions\/199"}],"wp:attachment":[{"href":"https:\/\/intelliprops.fhwn.ac.at\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=148"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}