SmartRobot


This research project is carried out by Ruben van den Brink and focuses on implementing a smart robot in the vacuum infusion process. It builds on earlier research into fibre placement and visual inspection. The first research into automating the vacuum infusion process was done during the RAAK-MKB program Robocompo, which was completed in 2016. A new subsidised RAAK-MKB program for automation research now focuses on First Time Right production of composites, which requires that defects occurring during the vacuum infusion process are detected and corrected automatically. The main objective of this research is to equip the robot with sensors and a control system in such a way that it contributes to first-time-right manufacturing in the vacuum infusion process.

This is my project plan, which will be the basis of my research: Project Plan Smart Robot.

Update 09-11-2017

Connected the robot to LabVIEW via the Digimetrix robotics library. The vision system can detect angles and compare them with each other, and the resulting angle can be loaded into the robot operation. This led to the following demonstrator. Have fun!
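To give an idea of the angle detection and comparison, below is a minimal sketch in Python with OpenCV. The actual implementation runs in LabVIEW with the Digimetrix robotics library, so the function names, thresholding and file name here are only illustrative assumptions.

```python
import cv2
import numpy as np

def ply_angle(gray_image: np.ndarray) -> float:
    """Estimate the in-plane orientation of a ply from a grayscale image."""
    _, mask = cv2.threshold(gray_image, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    ply = max(contours, key=cv2.contourArea)         # assume the largest blob is the ply
    (_, _), (_, _), angle = cv2.minAreaRect(ply)     # orientation of the minimum-area bounding box
    return angle

# Compare the measured angle with a reference angle; the difference is the
# rotation that is loaded into the robot operation.
reference_angle = 0.0                                # desired ply orientation (deg), assumed value
measured_angle = ply_angle(cv2.imread("ply.png", cv2.IMREAD_GRAYSCALE))
correction = reference_angle - measured_angle        # extra rotation for the robot move
```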


Update 30-11-2017

The robot is currently not operational, so I am testing the vision system for now. Hopefully the robot is operational again next week so the first steps of programming a demonstrator can be taken. Today I shot a small video which explains two functions of the vision system: detecting the presence of a fabric and calculating the coordinates of the corners and the center of the fabric. Have fun!
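For readers who want something concrete next to the video, here is a small sketch in Python with OpenCV of the same two functions: checking whether a fabric is present and computing its corners and center. The real routines are built in LabVIEW, so the area threshold and names below are assumptions for illustration only.

```python
import cv2
import numpy as np

MIN_FABRIC_AREA = 5000.0   # assumed minimum blob area (px) to count as "fabric present"

def analyse_fabric(gray: np.ndarray):
    """Return (present, corners, center) for a grayscale camera image."""
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return False, None, None                     # nothing detected in the image
    blob = max(contours, key=cv2.contourArea)
    if cv2.contourArea(blob) < MIN_FABRIC_AREA:
        return False, None, None                     # blob too small to be a fabric ply

    # Approximate the outline with a polygon to obtain the corner coordinates.
    perimeter = cv2.arcLength(blob, True)
    corners = cv2.approxPolyDP(blob, 0.02 * perimeter, True).reshape(-1, 2)

    # Center of the ply from the image moments of the blob.
    m = cv2.moments(blob)
    center = (m["m10"] / m["m00"], m["m01"] / m["m00"])
    return True, corners, center
```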

Update 14-12-2017:

This week another step closer to an actively controlled robot: the vision system is now mounted on the pick-and-place end-effector. This gives active control while picking up a carbon laminate, so the exact pick location does not have to be known beforehand when picking and placing the laminate. I made a video to demonstrate the process. Have fun!
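In essence, the active control converts the offset that the end-effector camera sees into a correction on the nominal pick pose. A rough sketch of that idea in Python is shown below; the scale factor, image center and frame conventions are assumptions, since the actual control is implemented in LabVIEW.

```python
MM_PER_PIXEL = 0.25          # assumed calibration of the end-effector camera
IMAGE_CENTER = (640, 512)    # assumed image center of the camera (px)

def pick_correction(ply_center_px, ply_angle_deg, desired_angle_deg=0.0):
    """Translate a vision measurement into a (dx, dy, dtheta) correction on the pick pose."""
    dx = (ply_center_px[0] - IMAGE_CENTER[0]) * MM_PER_PIXEL   # mm along the tool X-axis
    dy = (ply_center_px[1] - IMAGE_CENTER[1]) * MM_PER_PIXEL   # mm along the tool Y-axis
    dtheta = desired_angle_deg - ply_angle_deg                 # rotation about the tool Z-axis
    return dx, dy, dtheta

# The correction is added to the nominal pick pose, which is why the exact
# position of the laminate does not have to be known beforehand.
```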

Update 19-12-2017:

Today I combined the localization of a ply with moving the robot into a close-up position. This close-up position results in constant, optimal quality of the pictures of the plies. The higher the image quality, the better the defect detection. Have fun with the video!
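The idea behind the close-up move is simple: once the ply center is known in robot coordinates, the camera is placed at a fixed stand-off above it, so every inspection picture is taken from the same distance. A tiny sketch of that calculation is given below; the stand-off value and pose format are assumptions.

```python
STANDOFF_MM = 150.0          # assumed camera-to-ply distance for the close-up pictures

def closeup_pose(ply_center_xyz, ply_angle_deg):
    """Return an (x, y, z, rz) pose directly above the ply at a fixed stand-off."""
    x, y, z = ply_center_xyz
    return (x, y, z + STANDOFF_MM, ply_angle_deg)    # same relative pose for every picture
```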

After a long period of hard work to finish my graduation project, the final stage has finally been reached: the detection of defects in the plies is implemented in the robot operation. Multiple open-loop and closed-loop controls are used in the total process of placing a ply in the mold.

The first detection of the ply is open loop. The location (input) is checked by the vision system and compared to the desired input (a ply in the desired starting position), and the deviation between the actual input and the desired input is sent to the robot to pick up the ply; this is open-loop control. The defect detection process is also open-loop control: the input (the ply that is placed in the detection area) is compared to a desired input (a ply without defects), and the outcome of the defect detection determines the following sequence of the process.

The final stage of the process is the placement in the mold, where closed-loop control is applied. The output of the placement (the orientation) is compared to the desired output (a ply with correct placement), and the deviation from this desired output is sent to the robot to correct the placement. After the correction the orientation is checked again, and this loop continues until the orientation is within the set boundaries. In the following movie the final demonstrator is explained.
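To make the closed-loop placement step a bit more tangible, here is a small sketch of that loop in Python. The functions measure_orientation and rotate_tool are hypothetical stand-ins for the LabVIEW vision and robot calls, and the tolerance is an assumed value.

```python
TOLERANCE_DEG = 0.5          # assumed acceptance band for the final orientation
MAX_ITERATIONS = 10          # safety limit so the loop always terminates

def place_with_feedback(desired_angle_deg, measure_orientation, rotate_tool):
    """Correct the ply orientation until it is within the set boundaries."""
    for _ in range(MAX_ITERATIONS):
        error = desired_angle_deg - measure_orientation()   # compare output to desired output
        if abs(error) <= TOLERANCE_DEG:
            return True                                     # orientation within the boundaries
        rotate_tool(error)                                  # send the deviation to the robot
    return False                                            # tolerance not reached within the limit
```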

