Sunday, February 3, 2019

Methods of Moving the Robot with Kuka|prc

This post shows a few different ways to move a Kuka robot using Grasshopper and the Kuka|prc plug-in. The methods cover moving to a single point and moving between parametrically generated points. Also presented is the ability to automatically lift up between motions. This is useful for robotic processes which engage material, where you may wish to lift up and start at a new position without touching the material during the traverse to the new point.

Kuka|prc and Planes

Kuka|prc is a plug-in for Grasshopper used to control a Kuka industrial robot. PRC stands for Procedural Robot Control.

Kuka|prc moves the robot using planes. You can think of a plane as an X, Y, Z axis system plus an origin - roughly similar to the CPlane in Rhino. You can re-orient the CPlane so it is rotated to align differently, and you can move its origin so the 0,0,0 point is somewhere else in world space. Planes in Kuka|prc are like that: there are X, Y and Z axes, all perpendicular to one another, which can be oriented any way you wish, and you can move the origin of the plane to any location you choose.
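
If it helps to see this in code form, here's a minimal GhPython-style sketch (using RhinoCommon; the coordinates are hypothetical) that builds the kind of plane Kuka|prc consumes:

    import Rhino.Geometry as rg

    # A plane is just an origin plus three mutually perpendicular axes.
    origin = rg.Point3d(500, 200, 300)           # hypothetical target, in mm
    plane = rg.Plane(origin, rg.Vector3d.ZAxis)  # Z is the normal; X and Y follow

    # Re-locating the origin is like dragging the plane in world space:
    plane.Origin = rg.Point3d(600, 200, 300)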

The elegant thing is that the robot will follow the planes wherever you place them - within reason. If you move a plane outside the reach of the robot it, of course, can't get there. Robots also have limits to their joint motion, so some points are unreachable. But within the joint limits, if the robot can physically reach a location, putting a plane there in virtual space will make the robot move there and face the direction of the plane.

A Simple Example

Moving the robot to a single point in space, facing a required direction, is the simplest motion we can make. The coordinate system of the tool is aligned with the plane. The point to move to is the origin of the plane.

Here's a simple Grasshopper definition to do this:

The robot code is the standard we always use. You can read more about that set up in Robot Programming with Kuka|prc V3. In the configuration above the tool of the robot has the Z axis coming out from the center of the tool tip.

The plane to move to is created using the XY Plane component. This plane faces upward. Note that a Point component is wired into the origin of the plane. This lets you easily move the origin by dragging the Gumball widget in the viewport. (Note: When you assign the point, choose Coordinate from the command line menu. Use of Coordinate is what enables the Gumballs to work. If you aren't seeing the Gumballs after you select the Point component use the Grasshopper pull-down menu Display > Gumballs.)

The XY Plane is wired into the Plane socket of the LIN Movement component. This component generates the command to move to the point.

What happens is the tool axis is aligned with the plane. You can see that below - the robot moves from the Start Point to the plane we defined. Since the plane's Z axis is up, the robot faces upward:

If you wanted the robot to face downward you'd need to invert the Z axis of the target plane. You can do that with the Adjust Plane component. Note in the example below the Normal of the plane is set to the Unit Z component and the Factor of Unit Z is set to -1. That makes the Z vector point downward.
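
In code terms the flip is nothing more than choosing a downward normal. Here's a sketch of the same idea (the point is hypothetical):

    import Rhino.Geometry as rg

    target = rg.Point3d(600, 0, 250)                # hypothetical point, in mm
    up = rg.Plane(target, rg.Vector3d(0, 0, 1))     # plane's Z axis points up

    # Equivalent of Adjust Plane with Unit Z at a Factor of -1:
    down = rg.Plane(target, rg.Vector3d(0, 0, -1))  # plane's Z axis points down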

Now when the robot faces this plane it points down:

List of Procedurally Generated Points (following a Sine Wave)

In this example the robot moves between a series of points. The points are generated with the math function sine. Here's the bit of the definition which generates those points:

A series of values is generated using the Series component. These are the values for the X coordinates of the points. The X values are run through the Sine component, which generates the Y values. Those are wired into the Construct Point component to generate the full 3D points (Z defaults to 0).
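
The same pipeline in a few lines of Python, if that's easier to follow (the Series settings here are hypothetical):

    import math

    count, step = 50, 0.2            # hypothetical Series component settings
    points = []
    for i in range(count):
        x = i * step                 # Series: the X values
        y = math.sin(x)              # Sine: the Y values
        points.append((x, y, 0.0))   # Construct Point: Z defaults to 0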

Next these points need to be scaled up from very small to the size needed to move the robot between them. They also need to be moved to a position the robot can reach. Here's that bit of the definition:

The Scale component is used to scale the points. Since they are generated starting at the origin, 0,0 is left as the default center to scale about. That way they simply move away from the center as they scale up. Next the points are moved to a position within range of the robot. This is done by the Move component. It takes the geometry to move (the list of points) and a vector to translate with. The vector is constructed using the Vector 2 Pt component, which generates a vector between the two specified points. The first is the origin (set as the value in socket A) and the second is a point the user can control. That lets the user simply select the Point component and drag that point in the viewport to place the sine curve wherever desired.
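
Here's a sketch of the scale-then-move step using RhinoCommon transforms; the scale factor and the drag point are hypothetical stand-ins for the slider and Point component:

    import Rhino.Geometry as rg

    factor = 100.0                     # hypothetical scale factor
    anchor = rg.Point3d(0, 0, 0)       # scale about the origin
    drag = rg.Point3d(800, -200, 400)  # the user-dragged Point, in mm

    scale = rg.Transform.Scale(anchor, factor)
    move = rg.Transform.Translation(drag - anchor)  # Vector 2 Pt: A -> drag point

    # points: the (x, y, 0) tuples from the sketch above
    pts = [move * (scale * rg.Point3d(x, y, z)) for (x, y, z) in points]

The order matters: scaling about the origin first and translating second means the points grow in place and then slide over to the robot, rather than flying away from the scale center.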

Next the points are converted to planes. This is done the same way as in the previous example: the robot faces downward because the Z axis of each plane is reversed to point down using Adjust Plane with a negative Z axis. Then the planes are wired into the Linear Movement component to generate the list of linear motion commands.

Finally, the movement command list is wired into the Kuka Core component to simulate the motion and generate the code.

The result is the robot moves in straight lines between the points on the parametric sine curve.

Using Data Trees to Traverse between Planes

The next example is a definition to draw a Voronoi diagram - let's say, for the sake of example, into a clay surface. The robot end-effector is a "pin tool" - a needle-point tool to scratch into the surface. The definition is fairly simple. The key to it is the Safe Plane component of Kuka|prc, which allows the robot to move up between etching each curve into the surface.

Here's a Voronoi diagram drawn over the points which generate it. That set of points is generated randomly in this example. For each point there is a corresponding region consisting of all points closer to that point than to any other. The diagram is the boundaries of those regions.

Here's an image showing the robot traversing between drawing the curves:

Looking at the definition, first up is to generate the curves of the diagram. Here's that bit:

The Voronoi component on the right takes a set of points as input. These are generated randomly in a rectangular region using the Pop2D component. Voronoi also takes a rectangular boundary for the cells as well as a plane. The rectangular region is generated by the Rectangle component. As we've seen several times before, the plane has a Point as input. This is the origin of the plane, and the user can drag that point around in the viewport to locate the diagram within reach of the robot.
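
For reference, ghpythonlib mirrors most standard Grasshopper components by name, so the same pipeline in script form looks roughly like this - treat the wrapper names and argument order as assumptions, not gospel:

    import ghpythonlib.components as ghc  # assumed: mirrors the GH components
    import Rhino.Geometry as rg

    origin = rg.Point3d(700, 0, 0)               # user-placed plane origin, mm
    plane = rg.Plane(origin, rg.Vector3d.ZAxis)
    rect = rg.Rectangle3d(plane, 400, 300)       # boundary region, in mm

    pts = ghc.Populate2D(rect, 20, 1)            # Pop2D: 20 random points, seed 1
    cells = ghc.Voronoi(pts, None, rect, plane)  # one closed boundary per point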

Next up is converting the points to planes. This is similar to what was covered above with one important difference. The Voronoi diagram is a set of curves (a list of them in Grasshopper). The first thing we need to do is generate the points at the corners of each curve. This is done with the Control Points component. For each input curve it will output the control points. Very important to note here is that the output of Control Points is a data tree. You can see it is a data tree by the wire - it is shown as a dashed line. It's called a data tree because that's an easy way to visualize it - each curve is a branch in a tree. Each branch contains the points for that one curve. The next branch has the points for the next curve.
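
In GhPython the same structure is built with the DataTree class - one path (branch) per curve, holding that curve's control points. A sketch, assuming curves holds the Voronoi cell curves from the previous step:

    import Rhino.Geometry as rg
    from Grasshopper import DataTree
    from Grasshopper.Kernel.Data import GH_Path

    tree = DataTree[rg.Point3d]()
    for i, crv in enumerate(curves):      # one Voronoi cell per curve
        nc = crv.ToNurbsCurve()
        for j in range(nc.Points.Count):  # that cell's control points
            tree.Add(nc.Points[j].Location, GH_Path(i))  # branch {i}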

As before the points are wired into a XY Plane component which is inverted using Adjust Plane.

The planes are wired into the Linear Movement component as before. The new part is the use of the Safe Plane component. It takes the motion commands as input. For each branch of the tree it generates additional movement commands to move above the first point in the branch and above the last point in the branch. It moves up to a specified "safe plane". This is a plane the user defines above the height of the diagram points. That essentially lifts the tool out of the work and above it to traverse over to the next curve to draw:

The key is that it does this independently for each branch in the data tree. Since each branch contains all the points to draw a single Voronoi cell, the tool lifts up after finishing each cell and drops back in above the first point of the next one - which works perfectly.
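
Conceptually (this is not Kuka|prc's internal code, just the idea) the per-branch logic looks like this, where plane_branches is a hypothetical list of plane lists, one per cell:

    import Rhino.Geometry as rg

    SAFE_Z = 150.0  # hypothetical safe height above the work, in mm

    def lifted(pl, z):
        # Copy a plane straight up to the safe height, keeping its orientation.
        return rg.Plane(rg.Point3d(pl.Origin.X, pl.Origin.Y, z),
                        pl.XAxis, pl.YAxis)

    safe_paths = []
    for branch in plane_branches:
        # Enter from above the first point, exit above the last point.
        safe_paths.append([lifted(branch[0], SAFE_Z)] +
                          branch +
                          [lifted(branch[-1], SAFE_Z)])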

Here again is the image showing the traverses (light gray) above the cells in the diagram:


Friday, February 1, 2019

Remote Control Light Baton Tool

I made a new tool for use on the robot for doing light paintings. This one can be controlled from a phone or tablet app. It has 144 RGB LED lights. This post is the documentation on using it.

Hardware

The tool is made from baltic birch plywood and some electronics. The micro-controller is an Adafruit Feather 32u4 Bluefruit LE. The LED strip is made of NeoPixels from Adafruit. There is also a board to convert between the 3.7V battery level and the 5V required by the pixels.

Mounting on the Robot

On the East robot, in the src\meier folder you'll find a program called AxesA6AtZero. Run this all the way to the end (past the first halt, where the robot reaches the start point). That sets the rotation of A6 correctly so the tool can be installed vertically. Here's how it looks installed:


Software

To control the tool from your phone you need the Adafruit Bluefruit LE Connect app which is available for Android or Apple iOS devices and is free.

To configure it follow these steps:

Turn the tool on using the small switch inside. Off is towards the back, on is towards the front. When on, the LED lights will initially appear in a dim blue color.

Once you power on the tool you'll see the tool's name appear in the app. It's called Adafruit Bluefruit LE. Press the Connect button.

The app supports controlling many devices. For our needs we want the one called Controller:


From the next screen choose Control Pad:

The Control Pad user interface is shown below. Really intuitive and attractive, right?! Yes, it's a generic UI I have no control over so I can't label the buttons. Thus it is, uh, dreadful.

Here's a labelled diagram:

The number buttons (1-4) control the mode. You can choose between:
  1. Solid color
  2. Gradient of colors
  3. A moving gradient where the colors slide along the length of the tool
  4. A moving set of random colors which slide along the tool. 



The arrows control how the modes function.
  • Pressing the Up arrow changes the hue when you are in Solid color mode. 
  • The Left arrow changes the count of lights. It cycles between 144->72->36->18->9. 
  • The Right arrow controls the speed of the moving effects. It cycles between 100 and 10 milliseconds between updates of the motion. 
  • The Down arrow controls the intensity of the lights. It cycles between 16 different levels of brightness for the pixels. 

Here's a video which demonstrates these modes:

You can also enable the Quaternion control. This lets you tip your device back and forth to change colors.

From the Control Pad screen use the "< Controller" button on the top of the screen to go back. From that screen toggle on Quaternion.

This mode stays enabled until you turn it off. So you may go back to the Control Pad screen and use those controls as well.

When Quaternion mode is enabled you can rotate the phone or tablet to control the color in Solid color mode. Having the phone level in front of you is Red. Tipping the phone up towards vertical cycles between the colors: Orange -> Yellow -> Green -> Cyan -> Blue -> Magenta. When fully vertical the color is Red again.
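
The firmware's exact mapping isn't shown here, but the behavior described above amounts to mapping the tilt angle onto the hue wheel, wrapping back to red at vertical. A plausible reconstruction in Python:

    import colorsys

    def tilt_to_rgb(pitch_deg):
        # 0 degrees (level) -> red; the hue wraps back to red at 90 (vertical).
        hue = (pitch_deg % 90.0) / 90.0
        r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
        return int(r * 255), int(g * 255), int(b * 255)

    print(tilt_to_rgb(0))   # (255, 0, 0): red when the phone is level
    print(tilt_to_rgb(45))  # cyan at the halfway point of the tilt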

In Gradient mode you can rotate the phone to move the colors along the length of the tool.

When the Quaternion mode is on the Hue and Speed buttons have no effect. Count and Intensity still work.

Here's a video showing it in use:


Recharging

When you need to recharge the tool plug in a micro USB cable to the end of the tool which also has the on/off switch. Then plug the USB into a power source.

A full charge will last at least 3 to 4 hours of continuous operation.

Note: There is a similar connector on the other side of the tool however that is for programming it, not charging it. Be sure to plug into the one with the on/off switch. 

The cable and USB socket can be found in the tool cabinet:


Sunday, January 20, 2019

Single Light Tool Control

I made a new light tool for use with the robot. With this new tool you can change the color interactively using your phone. This post is the documentation on using it.

Hardware

The tool is made from baltic birch plywood and some simple electronics. The micro-controller is an Adafruit Feather 32u4 Bluefruit LE. The LED is a single NeoPixel.


You turn the tool on and off with the switch on top - off is to the left, on is to the right:

The battery lasts a long time (many hours of use). If you need to charge it, simply connect the tool to a USB port on the computer. Here's the port on the tool:


When charging a yellow LED will blink. When fully charged the LED goes off. Note: Make sure the tool is on, otherwise it doesn't charge even if connected. 

Software

To control the tool from your phone you need a special app. The Adafruit Bluefruit LE Connect app is available for Android or Apple iOS devices and is free.

To configure it to control the light tool, follow these steps:

Once you power on the tool you'll see its name appear in the app. It's called Adafruit Bluefruit LE. Press the Connect button.

Once connected the tiny blue LED lights up.

The app supports controlling many devices. For our needs we want the one called Controller: 


From the next screen choose Color Picker:

Once the color picker appears you can simply press the "Send selected color" button to immediately update the LED on the tool. To turn the LED off, send black by dragging the Luminance slider all the way to the left.


If you turn the tool off or close the app they will automatically disconnect.

Robot Programming with Kuka|prc V3

This post provides information on setting up a Grasshopper definition using Kuka|prc V3.

KUKA|prc is a set of Grasshopper components that provide Procedural Robot Control for KUKA robots (thus the name PRC). These components are very straightforward to use and it's actually quite easy to program the robots using them.

Terminology

Before we begin discussing KUKA|prc it's important to clarify some terminology that will be used in this topic.
  • Work Cell: All the equipment needed to perform the robotic process (robot, table, fixtures, etc.)
  • Work Envelope: All the space the robot can reach.
  • Degrees of Freedom: The number of axes of motion in the robot. To be considered a robot there needs to be a minimum of 4 degrees of freedom. The Kuka Agilus robots have 6 degrees of freedom. 
  • Payload: The amount of weight a robot can handle at full arm extension and moving at full speed.
  • End Effector: The tool that does the work of the robot. Examples: Welding gun, paint gun, gripper, etc.
  • Manipulator: The robot arm (everything except the End of Arm Tooling).
  • TCP: Tool Center Point. This is the point (coordinate) that we program in relation to.
  • Positioning Axes: The first three axes of the robot (1, 2, 3). Base / Shoulder / Elbow = Positioning Axes. These are the axes near the base of the robot. 
  • Orientation Axes: The other joints (4, 5, 6). These joints are always rotary. Pitch / Roll / Yaw = Orientation Axes. These are the axes closer to the tool. 

Rhino File Setup

When you work with the robots using KUKA|prc your units in Rhino must be configured for the Metric system using millimeters. The easiest way to do this is to use the pull-down menus and select File > New... then from the dialog presented choose "Small Objects - Millimeters" as your template.

The KUKA|prc User Interface

When installed KUKA|prc has a user interface (UI) much like other Grasshopper plug-ins. The UI consists of the palettes in the KUKA|prc menu.


There are six palettes which organize the components. These are:
  • 01 | Core: The main Core component is here (discussed below). There are also the components for the motion types (linear, spline, etc.). 
  • 02 | Virtual Robot: The various KUKA robots are here.  
  • 03 | Virtual Tools: Approach and Retract components are here (these determine how the robot should move after a toolpath has completed). There are also components for dividing up curves and surfaces and generating robotic motion based on that division. 
  • 04 | Toolpath Utilities: The tools (end effectors) are here. We'll mostly be using the Custom Tool component.  
  • 05 | Utilities: The components dealing with input and outputs are stored here. These will be discussed later. 
  • 06 | MX: These components are used with MX Automation, a way to control the robots in real-time. 

KUKA|prc CORE

The component you always use in every definition is called the Core. It is what generates the KUKA Robot Language (KRL) code that runs on the robot. It also provides the graphical simulation of the robot motion inside Rhino. Everything else gets wired into this component.

The Core component takes five inputs. These are:
  • SIM - This is a numeric value. Attach a default slider with values from 0.00 to 1.00 to control the simulation. 
  • CMDS - This is the output of one of the KUKA|prc Command components. For example a Linear motion command could be wired into this socket. 
  • TOOL - This is the tool (end effector) to use. It gets wired from one of the Tool components available in the Virtual Tools panel. Usually you'll use the KUKA|prc Custom Tool option and wire in a Mesh component which will show the tool geometry in the simulation.  
  • ROBOT - This is the robot to use. The code will be generated for this robot and the simulation will graphically depict this robot. You'll wire in one of the robots from the Virtual Robot panel. For the Agilus Workcell you'll use the Agilus KR6-10 R900 component.  
  • COLLISION - This is an optional series of meshes that define collision geometry. Enable collision checking in the KUKA|prc settings to make use of this. Note that collision checking has a large, negative impact on KUKA|prc performance. 
There are two outputs as well:
  • GEO: This is the geometry of the robot at the current position - as a set of meshes. You can right-click on this socket and choose Bake to generate a mesh version of the robot for any position in the simulation. You can use this for your own renderings. 
  • ANALYSIS: This provides detailed analysis of the simulation values. This has to be enabled for anything to appear. You enable it in the Settings dialog, Advanced page, Output Analysis Values checkbox. Then use the Analysis component from the Utilities panel. For example if you wire a Panel component into the Axis Values socket you'll see all the axis values. 

Settings

The gray KUKA|prc Settings label at the bottom of the Core component gives you access to its settings. Simply left click on the label and the dialog will appear.

The settings are organized into pages which you select from along the top edge of the dialog (Settings, Advanced, and Analysis). The dialog is modeless which means you can operate Rhino while it is open. To see the effect of your changes in the viewport click the Apply button. These settings will be covered in more detail later.

Basic Setup

There is a common set of components used in nearly all definitions. Not surprisingly, these correspond to the inputs on the Core component. Here is a very typical setup:

  • SIM: The simulation Slider goes from 0.000 to 1.000. Dragging it moves the robot through all the motion specified by the Command input. It's often handy to drag the right edge of this slider to make it much wider than the default size. This gives you greater control when you scrub to watch the simulation. You may also want to increase the precision from a single decimal point to several (say 3 or 4). Without that precision you may not be able to scrub to all the points you want to visualize the motion going through.
    You can also add a Play/Pause component. This lets you simulate without dragging the time slider. 
  • CMDS: The component which gets wired into the CMDS slot of the Core is really the heart of your definition and will obviously depend on what you are intending the robot to do. In the example above a simple Linear Move component is wired in.
  • TOOL: We normally use custom tools. Therefore a Mesh component gets wired into the KUKA|prc Custom Tool:Plane component. This gets wired into the TOOL slot of the Core. The Mesh component points to a mesh representation of the tool drawn in the Rhino file. See the section below on Tool orientation and configuration. 
  • ROBOT: The robot frequently used in the course is a KUKA KR30/60/HA. So that component is chosen from the Virtual Robots panel. It gets wired into the ROBOT slot of the Core.
  • COLLISION: If you want to check for collisions between the robot and the workcell (table) wire in the meshes which represent the workcell. As noted above this has a large negative impact on performance so use this only when necessary. 
  • SAVE: This optional socket is exposed by right-clicking the Core component and choosing "Expose Filename and Save Input". You wire in a Button component. Pressing the button sets this input to True and saves the file to the filename you provide below. 
  • NAME: This optional socket is exposed by right-clicking the Core component and choosing "Expose Filename and Save Input". This is the filename to save to. Enter the name only - the directory is set in the Settings dialog. Important Note: Avoid any special characters and don't use numbers at the beginning of the filename. Also note this name needs to be unique across the entire robot file system. Files with the same name in different folders are still a problem! 

Robot Position and Orientation

You need to define how the robot is positioned and oriented. For example a robot can be mounted on a gantry and hung upside down. You need to let PRC know about this!

KR-60 East

For the East KR-60 robot at Taubman College you need to define the Base to specify the correct position and orientation. You simply set the Base No. to 3 as shown below. You leave all the other values at 0. That's it. Note that forgetting to set your base to 3 will immediately give you an error when you run your program. 

Agilus KR-6 Robots

The Agilus workcell has two robots named Mitey and Titey. Depending on which one you are using you'll need to set up some parameters so your simulation functions correctly. These parameters specify the location and orientation of the robot within the workcell 3D model.

If you don't have the latest version, see below for how to set them up. 

Mitey

Mitey is the name of the robot mounted in the table. Its base is at 0,0,0. The robot is rotated about its vertical axis 180 degrees. That is, the cable connections are on the right side of the robot base as you face the front of the workcell.

To set up Mitey do the following:

Bring up the Settings dialog by left clicking on the KUKA|prc Settings label on the Core component. The dialog presented is shown below:

You specify the X, Y, and Z offsets in the Base X, Base Y, and Base Z fields of the dialog. Again, for Mitey these should all be 0. In order to rotate the robot around the vertical axis you specify 180 in the Base A field. You can see that the A axis corresponds to vertical in the diagram.
  • Base X: 0
  • Base Y: 0
  • Base Z: 0
  • Base A: 180
  • Base B: 0
  • Base C: 0
After you hit Apply the robot position will be shown in the viewport. You can close the dialog with the Exit button in the upper right corner.

Titey
The upper robot hanging from the fixture is named Titey. It has different X, Y and Z offset values and rotations. Use the settings below when your definition should run on Titey.

Note: These values are all in millimeters.
  • Base X: 1102.5
  • Base Y: 0
  • Base Z: 1125.6
  • Base A: 90
  • Base B: 180
  • Base C: 0
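
If you want to sanity-check these numbers in Rhino, the base values are just a frame: X, Y, Z position the origin and A, B, C rotate it. Here's a sketch, assuming the standard KUKA convention that A, B, and C rotate about the frame's Z, Y, and X axes in that order:

    import math
    import Rhino.Geometry as rg

    def base_plane(x, y, z, a, b, c):
        pl = rg.Plane.WorldXY
        pl.Origin = rg.Point3d(x, y, z)
        pl.Rotate(math.radians(a), pl.ZAxis, pl.Origin)  # A: about Z
        pl.Rotate(math.radians(b), pl.YAxis, pl.Origin)  # B: about the new Y
        pl.Rotate(math.radians(c), pl.XAxis, pl.Origin)  # C: about the new X
        return pl

    mitey = base_plane(0, 0, 0, 180, 0, 0)             # spun 180 about vertical
    titey = base_plane(1102.5, 0, 1125.6, 90, 180, 0)  # hung from the fixture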

Code Output

The purpose of KUKA|prc is to generate the code which runs on the robot controller. This code is usually in the KUKA Robot Language (KRL). You need to tell KUKA|prc what directory and file name to use for its code output. Once you've done this, as you make changes in the UI, the output will be re-written as necessary to keep the code up to date with the Grasshopper definition.

To set the output directory and file name follow these steps:
  • Bring up the Settings dialog via the Core component. 
  • On the main Settings page, enter the project filename and choose an output directory. Do not start the filename with a number and do not use these characters  < > : " / \ | ? *

Note: You can also expose the filename input on the Core component. Simply right-click on the Core and choose "Expose Save and Filename Inputs". If you do that the filename is set outside this dialog. However the directory specified here is used.

That's all you need to do to generate code.

See the topic Taubman College Agilus Workcell Operating Procedure for details on how to get the code onto the robot and run it.

Start Position / End Position

When you work with robots there are certain issues you always have to deal with:
  • Reach: Can the robot's arm reach the entire workpiece?
  • Singularities: Will any joint positions result in singularities? (See below for more on this topic) 
  • Joint Limits: During the motion of the program will any of the axes hit their limits? 
One setting which has a major impact on these is the Start Position. The program needs to know how the tool is positioned before the motion starts. This value is VERY important. That's because it establishes where each joint starts out relative to its limits. Generally, you should choose a start position that doesn't have any of the joints near their rotation limits - otherwise your programmed path may cause them to hit the joint limit. This is a really common error. Make sure you aren't unintentionally near any of the axes' limits.

Also, the robot will move from its current position (wherever that may be) to the start position. It could move right through your workpiece or fixture setup. So make sure you are aware of where the start position is, and make sure there's a clear path from the current position of the robot to the start position. In other words, jog the robot near to the start position to begin. That'll ensure the motion won't hit your setup.

You specify these start and end position values in the Settings of the Core. Bring up the settings dialog and choose the Advanced page.

Under the Start / Endposition section you enter the axis values for A1 through A6. This begs the question: "How do I know what values to use?"

You can read these directly from the physical robot pendant. That is, you jog the robot into a reasonable start position and read the values from the pendant display. Enter the values into the dialog. Then do the same for the End values. See the section Jogging the Robot in topic Taubman College Agilus Workcell Operating Procedure.

You can also use KUKA|prc to visually set a start position and read the axis values to use. To do this you wire the KUKA|prc Axis component into the Core component. You can "virtually jog" the robot to a specific position using a setup like this:

Then simply read the axis values from your sliders and enter these as the Start Position or End Position.

Another way is to move the simulation to the start point of the path. Then read the axis values from the Analysis output of the Core Settings dialog. You can see the numbers listed from A01 to A06. Jot these down, one decimal place is fine. Then enter them on the Advanced page.

Initial Posture

Related to the Start Position is the Initial Posture setting. If you've set the Start Position as above and are still seeing extra motion (like a big shift in one of the axes to reorient), try the As Start option. This sets the initial posture to match the start position.

Motion Types

KUKA|prc provides several motion types. These are Point to Point, Linear, Circular, or Spline. This section presents the differences between the motions and the components and settings used to get them in your definitions.

See the post Robot Motion Analysis Using Light for a visual display of the motion types.

Note: The information in this section contains material excerpted from the KUKA documentation.

PTP: Point to Point

The robot guides the TCP along the fastest path to the end point. The fastest path is generally not the shortest path and is thus not a straight line. As the motions of the robot axes are rotational, curved paths can be executed faster than straight paths. The exact path of the motion cannot be predicted.

You get this motion type by using the KUKA|prc PTP Movement component.



You use the right-click menu on this component to choose the interpolation settings. Choose No Interpolation or C_PTP.

LIN: Linear

The robot guides the TCP at a defined velocity along a straight path to the end point. This path is predictable.

You get this motion type by using the KUKA|prc Lin Movement component.



You use the right-click menu on this component to choose the interpolation settings. Choose No Interpolation, C_DIS or C_VEL.

CIRC: Circular

The robot guides the TCP at a defined velocity along a circular path to the end point. The circular path is defined by a start point, auxiliary point and end point.

You get this motion type by using the KUKA|prc Cir Movement component.



You use the right-click menu on this component to choose the interpolation settings. Choose No Interpolation, C_DIS or C_ORI.

SPLINE: Smooth Spline 

The robot will move along the positions in a smooth spline motion.

You get this motion type by using the KUKA|prc Spline Movement component.


You use the right-click menu on this component to choose the interpolation settings. Choose No Interpolation, C_DIS or C_ORI.

Approximate Positioning - Interpolation Settings

In order to increase velocity, points for which exact positioning is not necessary can be approximated. The robot essentially takes a shortcut.

All the movement types are affected by interpolation settings. These can be turned off or enabled via right-click menus on the movement components - linear movement options are shown below:

The values used for interpolation are set on the Advanced page of the Core Settings:

Motions with Approximate Positioning

Interpolation affects the way the robot smooths movement. Without interpolation, the robot will briefly stop at each position. The interpolation values affect at which point the robot starts to interpolate – either at a certain distance from the target point, or at a certain percentage between the start and target point. Generally, a higher value leads to smoother movement but less accuracy.

Approximate positioning is activated by entering values for CDIS, CVEL, or CORI. The larger the values in CDIS, CVEL, or CORI, the earlier the approximate positioning begins. In certain circumstances, the system may shorten approximate positioning, but it will never lengthen it.

The approximate positioning motion is automatically generated by the controller. To make approximate positioning possible, the value for Advance Run must be at least 1. 

CDIS: A distance in mm can be assigned to the interpolation setting CDIS. In this case the controller leaves the path, at the earliest, when the distance from the end point falls below the value in CDIS.
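
Geometrically, C_DIS just cuts the corner. A small sketch of the idea (this is the concept only - the controller computes the actual blend internally):

    import Rhino.Geometry as rg

    def cdis_corner(p_prev, p_target, p_next, cdis):
        # The robot may leave the path once it is within cdis mm of the target...
        v_in = p_target - p_prev
        v_in.Unitize()
        leave = p_target - v_in * cdis
        # ...and rejoins the next segment (shown symmetric here for illustration).
        v_out = p_next - p_target
        v_out.Unitize()
        rejoin = p_target + v_out * cdis
        return leave, rejoin

    leave, rejoin = cdis_corner(rg.Point3d(0, 0, 0), rg.Point3d(100, 0, 0),
                                rg.Point3d(100, 100, 0), 20.0)  # CDIS = 20 mm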

CVEL: A percentage value can be assigned to the interpolation setting CVEL. This value specifies the percentage of the programmed velocity at which the approximate positioning process is started, at the earliest, in the deceleration phase of the motion. The path will be altered in order to maintain the specified percentage of the programmed velocity.

CORI: An orientation percentage can be assigned to the interpolation setting CORI. In this case, the path is left, at the earliest, when the dominant orientation angle (swiveling or rotation of the longitudinal tool axis) falls below the angle percentage, defined in CORI.

CPTP: The following is taken from the Kuka Expert Programming Guide: For the purposes of PTP approximate positioning, the controller calculates the distances the axes are to move in the approximate positioning range and plans velocity profiles for each axis which ensure tangential transition from the individual instructions to the approximate positioning contour. Uhhh... say what now?! How about this: The greater the value of CPTP, the more the path is rounded!

See the post Robot Motion Analysis Using Light for a visual display of the motion types.

Tool Setup

Correct setup of tools is essential. The dimensions and orientation of the tool need to be set in KUKA|prc as well as on the robot controller, and the values need to match - a mismatch is a very common source of problems. Matching means that an ID number is assigned to the tool in KUKA|prc and the same values must be set under the corresponding tool ID on the robot controller.

Understanding tool setup is really about understanding the coordinate system tools are based on. The coordinate system uses the right-hand rule. Using your right hand (!) position your fingers perpendicular to one another as shown below. The X axis is in the direction of the thumb, the Y axis is the index finger, and the Z axis is along the middle finger.

The default orientation of this coordinate system is aligned on the tool plate as follows: the +X axis comes directly perpendicular out of the tool plate, the +Z axis is perpendicular to it and points up, and the +Y axis is perpendicular to the other two.

The orientation of this coordinate system can easily be changed in the tool definition. In all the samples used below the tool is defined with +Z coming out from the tool plate.

Tool Mesh

When you 3D model the tool do so at the world origin and such that the tool Z axis is aligned with world Z. In the case of the Agilus workcell the origin is at the base of the robot, Mitey.

As an example, here's the Axis Teach Tool. It visually shows the orientation of the robot using Red (X axis), Green (Y axis) and Blue (Z axis). If you jog in Tool Mode you can see the robot slide along the red, green or blue dowel axes.

Here's how you'd 3D model this tool in Rhino. The tool is located at world 0,0,0, with +Z going up. World 0,0,0 is right at the base of the robot:


Note that the tool mount plate needs to be modeled as part of the tool. It is a cylinder, 10.5mm (0.413") thick and 88.9mm (3.5") in diameter.

Note that the tool needs to be a mesh. A single mesh - so use the Mesh command to convert the NURBS geometry to a mesh. Then if necessary use MeshBooleanUnion to generate a single mesh of all the parts.

Custom Tool-Plane Setup

Here's how you set up the tool inside Kuka|prc. Use the Custom Tool-Plane component (available in the Virtual Tools panel). Use WorldXY as the plane. Use a Mesh component to retrieve the single mesh. Use an Integer or Panel component to provide the tool ID number. These all get wired into the Custom Tool - Plane.

Note how the tooltip shows the resulting transformation you use at the physical robot. These numbers are simply the values required to change the coordinate system from the default (+X perpendicular to the tool face plate) to your desired one (+Z perpendicular to the tool face plate). A simple rotation about the Y axis does this. Thus the tool definition is a rotation about B (the Y axis) of 90 degrees.

In this case an offset from the world origin is used. In the teach tool example, you want the origin to be at the center of rotation for the three axes. This is 20mm above the base of the tool. In this way, when you rotate the tool in Tool Mode, it will revolve around the center of each axis.

You can see the resulting offset you enter at the robot, again, using the tooltip. Note that Z is now 20mm.

You enter these numbers X 0, Y 0, Z 20, A 0, B 90, C 0 into the tool ID of 6 using the physical robot pendant. See the section Installing a Tool in Taubman College Agilus Workcell Operating Procedure for details on how to enter these values.
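
If you want to verify the B 90 value for yourself, the math is a single rotation about Y. A quick check in Python:

    import math
    import Rhino.Geometry as rg

    # The TCP: 20 mm up the tool's Z axis, at the axes' center of rotation.
    tcp = rg.Plane(rg.Point3d(0, 0, 20), rg.Vector3d.ZAxis)

    # B is a rotation about Y. Rotating +Z by 90 degrees about +Y lands on +X,
    # i.e. the default +X-out-of-the-plate frame becomes +Z-out-of-the-plate.
    v = rg.Vector3d(0, 0, 1)
    v.Rotate(math.radians(90), rg.Vector3d(0, 1, 0))
    print(v)  # approximately (1, 0, 0)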

Working with Planes

See the topic Working with Planes in Kuka|prc for details on using planes in your definitions.


Singularities

A singularity results from the collinear alignment of two or more robot axes which causes unpredictable robot motion or unexpected velocities in the motion. For this reason, motion paths that make the robot pass near singularities should be avoided.

KUKA robots with 6 degrees of freedom have 3 different singularity positions.
  • Overhead singularity
  • Extended position singularity
  • Wrist axis singularity

Overhead

In the overhead singularity, the wrist root point (intersection of axes A4, A5 and A6) is located vertically above axis 1.

Extended position

In the extended position singularity, the wrist root point (intersection of axes A4, A5 and A6) is located in the extension of axes A2 and A3 of the robot. The robot is at the limit of its work envelope.

Wrist axes

In the wrist axis singularity position, the axes A4 and A6 are parallel to one another and axis A5 is within the range ±0.018°.