# X, Y and Z positions and angle positions.

The UR robot is a 6-axis robot, so the calculation of the robot coordinates is a complex equation that also involves rotation vectors. It might seem simple to say “I just want the robot to go to an X, Y, Z position” – yes, but there are many different poses the robot can have at that exact X, Y, Z position. For example, should the head approach from the left, from the right, at an angle, or upside down from below? They can all reach the exact same X, Y, Z position. The angle, or rotation, of the tool head can therefore differ, and it is thrown into the equation as well.

For a better understanding, focus on the X, Y and Z position first.

Just consider the robot in an X, Y, and Z coordinate system.

This shows the robot at X = 0 mm, Y = 430 mm and Z = 400 mm. It is easy to see in the coordinate system, and it can also be seen in the Information window on the Move screen.

In this case the X, Y, Z coordinates are shown with the base as reference.

Don’t worry too much about Rx, Ry and Rz yet – they will be explained later. But it is important to understand, and to let it sink in, that the X, Y and Z coordinates are seen from the Base position in this case (this can actually be changed, but that would complicate the explanation), whereas the rotation of the tool head is seen from the tool head position itself. This is paramount to understand; otherwise it is very easy to get confused.

Maybe consider your arm: the fingertip position is the X, Y and Z coordinates seen from your body, whereas the rotation of your wrist is Rx, Ry and Rz – with the X, Y and Z axes shifted to your fingertip.

## X, Y, Z vector versus Joint angles.

This part of the screen shows the joint angle of each joint. This does not directly tell where the robot head is positioned in space. Note that each joint angle is expressed in degrees.

Each joint can turn +360 degrees and -360 degrees, i.e. a total range of 720 degrees.

This part of the screen shows the tool position as a 6-axis vector, with the X, Y, Z position in mm seen from the base as reference.

The tool head rotation, on the other hand, is given as angles expressed in radians, with the tool head point itself as reference.

As seen above, the joint angles are shown in degrees, but they could also be expressed in radians. This is important to notice, because later, when used in script programming, these joint angles have to be provided in radians.

### Formula for converting radians to degrees on UR, and degrees to radians.

(Radians / π) x 180 = the degrees shown in the Move screen.

π x (the degrees shown in the Move screen) / 180 = radians.

(π ≈ 3.14159; the 3.14 used elsewhere on this page is a rounded value.)
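These formulas can be checked with a couple of one-line helpers – a sketch in Python (the robot's own script language has an equivalent function, d2r()):

```python
import math

def rad_to_deg(rad):
    # (radians / pi) * 180 = the degrees shown in the Move screen
    return rad / math.pi * 180.0

def deg_to_rad(deg):
    # pi * degrees / 180 = the radians used in script commands
    return math.pi * deg / 180.0

print(rad_to_deg(3.14))   # approximately 180 degrees (tool head facing down)
print(deg_to_rad(90.0))   # approximately 1.5708 radians
```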

# RX, RY and RZ rotation vector movement.

The position given by RX, RY and RZ is a rotation vector based on the physics of the robot, i.e. the length of the arms and joints and the rotation of the joints.

The calculation of the tool head position is a set of equations in which these factors take part. The normal tool head position, facing down, is 180 degrees from the origin of these vectors. So to illustrate changes to RX, RY and RZ, rotate the tool head to face straight up, so that RX, RY and RZ can all be 0.

Put the robot in this pose, i.e. X = 0, Y = 400 mm, Z = 600 mm, RX = 0, RY = 0, RZ = 0.

Robot Pose.

RX, RY and RZ can be changed individually by these controls.

RX, RY and RZ can also be changed by keying in the value of the desired angle.

Current value for RX.

Type in the desired angle in radians and click OK.

Press Auto to perform the move.

The robot tool head has turned around the X axis.

Using the Arrow keys for the tool head.

Set the robot in position X = 0, Y = 400 mm, Z = 600 mm, RX = 0, RY = 0, RZ =0

Press one of the X rotation arrows.

The robot has turned around the X axis

Set the robot in position X = 0, Y = 400 mm, Z = 600 mm, RX = 0, RY = 0, RZ =0

Press one of the Y rotation arrows.

The robot has turned around the Y axis

Set the robot in position X = 0, Y = 400 mm, Z = 600 mm, RX = 0, RY = 0, RZ =0

Press one of the Z rotation arrows.

The robot has turned around the Z axis. (Not easy to see in this graphic, but only the last joint, with the connector, has turned.)

(This is easier to see on a physical robot.)

When the robot tool head is facing down, RX is turned 180 degrees, i.e. 3.14 radians.

When the robot head is turned around the Y axis, RX and RZ also change value, because the entire equation involves the 180-degree offset of the tool head.

## Move to X, Y, Z, Rx, Ry, Rz position.

In these examples the Feature is shown from a “Base” perspective i.e. set the Feature to “Base” view.

The X, Y and Z are the position of the robot tool head in mm as in a coordinate system.

Rx, Ry and Rz are the orientation of the tool head as an angle in radians around the axis named after the “R”, i.e. Rx is the angle around the X axis in radians.
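A detail worth knowing (an addition to the description above): Rx, Ry and Rz together form one rotation vector, i.e. an axis-angle representation. The direction of the vector (Rx, Ry, Rz) is the axis the tool head is rotated about, and the length of the vector is the rotation angle in radians. When only one component is non-zero, the value therefore reads directly as the angle around that single axis. A minimal Python sketch:

```python
import math

def rotation_angle(rx, ry, rz):
    # The length of the rotation vector (Rx, Ry, Rz) is the total
    # rotation angle in radians; its direction is the rotation axis.
    return math.sqrt(rx * rx + ry * ry + rz * rz)

# Tool head facing straight down: rotated 3.14 rad (180 deg) about X only.
print(rotation_angle(3.14, 0.0, 0.0))   # 3.14
```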

X = 0 mm, Y = 430 mm, Z = 400 mm

X = 0 mm                                       Y = 430 mm                                    Z = 400 mm

### Turn the robot tool around an Axis.

Observe the red, green and blue lines below the robot graphics. The red line illustrates the X axis, the green line the Y axis and the blue line the Z axis. These axes are all seen from the tool head point.

On the left side of the screen, under “Move Tool”, the same colours are used on the arrows to indicate which axis is moved.

Pay attention to these arrows.
Focus now only on these arrows and press the Red arrow on the left side.

Turn the Tool head around the X axis (Rx).

Turn the Tool head back around the X axis (Rx).

Notice how the robot turns the tool head around the X axis.

Turn the Tool head back around the X axis (Rx).

Notice how the robot turns the tool head around the X axis.

Turn the Tool head around the Y axis (Ry).

Turn the Tool head back around the Y axis (Ry).

Notice how the robot turns the tool head around the Y axis.

Turn the Tool head back around the Y axis (Ry).

Notice how the robot turns the tool head around the Y axis.

Turn the Tool head around the Z axis (Rz).

Turn the Tool head back around the Z axis (Rz).

Notice how the robot turns the tool head around the Z axis.

(Not easy to see on the graphics because only the last joint turns).

Turn the Tool head back around the Z axis (Rz).

Notice how the robot turns the tool head around the Z axis.

(Not easy to see on the graphics because only the last joint turns).

The X, Y, Z, Rx, Ry, Rz values will change when the tool head is manipulated, according to the physics of the robot's arms and body structure.

### Turn the robot tool around the Y axis (Ry).

Instead of using the arrows as manipulators, it is also possible to key in the angle of the tool head position. This is easiest to understand when the robot is aligned along the X or the Y axis. In the first example the robot is aligned along the Y axis, i.e. X = 0.

Press the Ry bar. Change from 3.14 to 2.00 radians.

Press the “Auto” button to perform the move. Ry has now changed to 2.00 radians.

Robot at Ry = 3.14 Rad.    Robot at Ry = 2.00 Rad.   Robot at Ry = 2.00 Rad

Change the robot back to Ry = 3.14 Radians.

Press the Ry button and set Ry to 3.14.

Try just a small change, e.g. set Ry to 3.1. Notice how the robot performs the move immediately, without the need to press “Auto”, because the change is small.

Set the robot back to Ry = 3.14.

The next example will show the robot aligned along the X axis, i.e. Y = 0. Change the robot to the Y = 0 position.

Use the “Base” arrow bar to move the robot over to the Y = 0 position.

The robot swings over to position the tool head at the X axis, i.e. Y = 0.

Notice how Rx and Ry also change along the way, because the tool head also moves.

Change Ry to 0.

Change Rx to 3.14.

The robot is at Y = 0 and the tool head is aligned with the X axis.

Press the Rx bar. Change Rx to 2.00. Press “Auto” to perform the move.

The tool head has now turned around the X axis.

Rotated around the X axis.

### Move the Robot by X, Y, Z positions via Script.

#### Move the tool head around the Y axis.

In this example the robot is at X = 0, Y = 430 and Z = 400, with the tool head at Ry = 3.14 radians.

Then this script command is sent to the robot.

s.send("movej([-1.8263219632699421, -1.7319098497843228, 1.7991931614989278, -1.6389153321983159, -1.5723347175650684, 2.8868157860256334], a=1.3962634015954636, v=1.0471975511965976)" + "\n")

time.sleep(10)

s.send("movej(p[0.0000000000000000, 0.4300000000000000, 0.4000000000000000, 0.0000000000000000, 2.0000000000000000, 0.0000000000000000], a=1.3962634015954636, v=1.0471975511965976)" + "\n")

Note:

The “p” (here highlighted) in front of the parameter list changes the command from a joint-position move to a pose move.

In this example the tool head is moved from 3.14 radians to 2.0 radians around the Y axis.

Notice how X, Y and Z remain the same, but Ry is changed from 3.14 to 2.00 by the script command below.

Using this description in the Script manual as Inspiration.

Notice how the tool head has turned around the Y axis.

#### Move the tool head along the Y axis.

It is also possible to move in the direction of an axis. In this example the robot is moved 200 mm in the Y axis direction.

s.send("movej([-1.8263219632699421, -1.7319098497843228, 1.7991931614989278, -1.6389153321983159, -1.5723347175650684, 2.8868157860256334], a=1.3962634015954636, v=1.0471975511965976)" + "\n")

time.sleep(10)

s.send("movej(p[0.0000000000000000, 0.4300000000000000, 0.4000000000000000, 0.0000000000000000, 2.0000000000000000, 0.0000000000000000], a=1.3962634015954636, v=1.0471975511965976)" + "\n")

time.sleep(10)

s.send("movej(p[0.0000000000000000, 0.6300000000000000, 0.4000000000000000, 0.0000000000000000, 2.0000000000000000, 0.0000000000000000], a=1.3962634015954636, v=1.0471975511965976)" + "\n")

See the result below.

Complete program for moving the robot 200 mm along the Y axis. (Note the change from 0.43000 to 0.63000.)

# Echo client program
import socket
import time

HOST = "192.168.0.9"    # The remote host
PORT = 30002            # The same port as used by the server
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect((HOST, PORT))
s.send("set_analog_inputrange(0, 0)" + "\n")
data = s.recv(1024)
s.close()
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect((HOST, PORT))
s.send("set_analog_inputrange(1, 0)" + "\n")
data = s.recv(1024)
s.close()
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect((HOST, PORT))
s.send("set_analog_outputdomain(0, 0)" + "\n")
data = s.recv(1024)
s.close()
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect((HOST, PORT))
s.send("set_analog_outputdomain(1, 0)" + "\n")
data = s.recv(1024)
s.close()
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect((HOST, PORT))
s.send("set_tool_voltage(24)" + "\n")
data = s.recv(1024)
s.close()
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect((HOST, PORT))
s.send("set_runstate_outputs([])" + "\n")
data = s.recv(1024)
s.close()
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect((HOST, PORT))
data = s.recv(1024)
s.close()
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect((HOST, PORT))
s.send("set_gravity([0.0, 0.0, 9.82])" + "\n")
data = s.recv(1024)
s.close()
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect((HOST, PORT))

s.send("movej([-1.8263219632699421, -1.7319098497843228, 1.7991931614989278, -1.6389153321983159, -1.5723347175650684, 2.8868157860256334], a=1.3962634015954636, v=1.0471975511965976)" + "\n")

time.sleep(10)
s.send("movej(p[0.0000000000000000, 0.4300000000000000, 0.4000000000000000, 0.0000000000000000, 2.0000000000000000, 0.0000000000000000], a=1.3962634015954636, v=1.0471975511965976)" + "\n")

time.sleep(10)
s.send("movej(p[0.0000000000000000, 0.6300000000000000, 0.4000000000000000, 0.0000000000000000, 2.0000000000000000, 0.0000000000000000], a=1.3962634015954636, v=1.0471975511965976)" + "\n")

data = s.recv(1024)
s.close()
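The program above is written in Python 2 style, where socket.send() accepts a string. On Python 3 the script line must be encoded to bytes before sending; a minimal sketch of the same connect/send/close pattern (same host and port assumptions as above):

```python
import socket

HOST = "192.168.0.9"   # robot IP, as in the example above
PORT = 30002           # the same port as used by the server

def script_line(cmd):
    # Python 3: socket.send() takes bytes, so the URScript line
    # must be terminated with a newline and encoded first.
    return (cmd + "\n").encode("utf-8")

def send_line(cmd):
    # Same connect/send/close pattern as the program above.
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.connect((HOST, PORT))
    s.send(script_line(cmd))
    s.close()

# Example call (requires a reachable robot):
# send_line("set_digital_out(2, True)")
```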

Disclaimer: While Zacobria Pte. Ltd. believes that the information and guidance provided are correct, parties must rely upon their own skill and judgement when making use of them. Zacobria Pte. Ltd. assumes no liability for loss or damage caused by error or omission, whether such an error or omission is the result of negligence or any other cause. Where reference is made to legislation it is not to be considered as legal advice. Any and all such liability is disclaimed.

If you need specific advice (for example, medical, legal, financial or risk management), please seek a professional who is licensed or knowledgeable in that area.

Author:
By Zacobria Lars Skovsgaard
Accredited Universal Robots support Centre and Forum.

Also check out the CB3 forum

## 235 thoughts on “X, Y and Z position.”

1. Franziska

Hi,
Thank you for this. Do you know if there is a way to move the robot along a tool axis with urscript? Basically I need the same command the polyscope has in the “move” window with the Feature setting on “tool” but for remote control.
Thanks

Hi Franziska

Thanks for the question.

When using urscript the robot moves according to the base plane. I have not seen a function to move the robot according to the tool axis when using urscript.


2. Jonas

Hi Lars,

on my UR10 I have configured a tool with a TCP. Now I want to align the y-Axis of my tool coordinate system with the x-axis of the base coordinate system, so I can realise a straight movement in the axis directions. Is there a simple solution to this and how do I do this inside an application program?

Greetings from Germany
Jonas

Hi Jonas

Thanks for the question.

I have not seen a command that can do that. Maybe some mathematical transformation is needed, but I don't have the equation.


Hi Subashree

https://www.universal-robots.com/how-tos-and-faqs/how-to/ur-how-tos/real-time-data-exchange-rtde-guide-22229/


3. Subashree

Hi,
while (True):
  $ 1 "Robot Program"
  $ 2 "Loop"
  while (True):
    $ 3 "MoveJ"
    $ 4 "Waypoint_1"
    movej([-0.41050144007068745, -1.7891077237847028, -0.9663539002439536, -1.906161354486664, 1.5692255305388803, -3.393017293763275], a=1.3962634015954636, v=1.0471975511965976)
    $ 5 "Waypoint_2"
    movej([-0.5766172869857424, -2.1305601186065113, -1.7653676579396944, -0.7158618557111421, 1.5684401203912224, -3.393017293763275], a=1.3962634015954636, v=1.0471975511965976)
    $ 6 "Waypoint_3"
    movej([-1.3032646253656317, -1.5563618298410513, -1.761089960765078, -1.5162149895933852, 1.5366515323962338, -3.393017293763275], a=1.3962634015954636, v=1.0471975511965976)
  end
  $ 7 "Wait: 5.0"
  sleep(5.0)
end

This is my sample program. In this code I have to write a cycle counter. Can you help me with how to write that code in this program?

4. Subashree

Hi, I am new to this programming. I want to know how to write a sample robot program using the wait and loop options?

Hi Subashree

Thanks for the question.


5. raymond

I would like to control the UR5 to a certain pose. I find that when I apply the ‘top plane’ pose, the end effector is not pointing down. What is wrong with the horizontal pose command?

vertical one (correct one) as shown in img:
movej(p[ 0.4, 0.0, 0.6, -0.020382145524755171, 1.5647458834840668, 0.014027805429229736])

horizontal one (UR seems wrong pose):
movej(p[0.4, 0.0, 0.6, -2.9780318745639898, 3.0743386623577078, -0.041448699170086881])

Thanks.

https://imgur.com/a/KlSXD

Hello raymond

Thanks for the question.

It might be that the Rx, Ry, Rz data in the pose is not correct.

One way to get the data is to put the robot in the desired pose – and then read the values in the “Move” screen to verify.


1. raymond

Hi, Zacobria Lars Skovsgaard,
Please kindly refer to below pic, the pose is correct.
https://imgur.com/a/KlSXD
Much appreciated if you can spend little time to follow my unit test case, and let me know if you can duplicate the problem I faced, and provide some advice.

movej(p[0.4, 0.0, 0.6, -2.9780318745639898, 3.0743386623577078, -0.041448699170086881])

Many thanks for help.

6. Ekrem

Hi, sir.

I am using a UR10 robot and a gyroscope. I am giving the robot rx = 45, ry = 45 and rz = 45 degrees from the pendant. My gyro is at the end of the robot. I am reading roll (around the X axis), pitch (around the Y axis) and yaw (around the Z axis) angles from the gyro. But the gyro values and the robot angle values do not match. Why is that happening? Are rx, ry and rz different from roll, pitch and yaw?
I need an answer so much :(

Thank you.

Hi Ekrem

Thanks for the question.

The Rx, Ry and Rz values shown in the Move screen are given in radians. Note also that together they form a rotation vector (an axis-angle representation), which is not the same as roll, pitch and yaw (Euler angles) – so gyro roll/pitch/yaw readings are not expected to match Rx, Ry, Rz directly.


7. Jonathan

Hi
Apologies if this has already been answered… I have a UR3 with cables going along the arm to the wrist. Obviously the cables will get stretched if the wrist turns around too many times, but my program uses variables to control the angle and sometimes the shortest point between two angles results in a continued rotation in the ‘wrong’ direction.
Is there any way to limit the rotation of the last joint, so that cables can’t become twisted?
Many thanks,
Jonathan

Hi Jonathan

Thanks for the question.

In the Safety menu under Joint Limits and Position Range the Wrist 3 can be restricted.


8. layeb

Hi,

I have an RG2 gripper fixed on a UR10 robot. I noticed that when I rotate wrist 3 by a 180° angle, the X and Y positions remain the same, but at the same time, as far as I can see, the position of the gripper changes. Is this normal, or could the cause be that the gripper was attached in an inclined position?

Hi layeb

Thanks for the question.

I am not sure what you mean by “remain the same”.

When the tool head is rotated with the gripper mounted on the tool head – then the tool head coordinates follow the turn.


1. layeb

Hi Lars and thank you for your quick response. What I meant is that rotating the tool head around the Z axis doesn't change the X, Y coordinates, which is logical, but what I noticed is that the gripper position shifts. To be more sure, I marked the initial gripper position before the movement, and after rotating the tool head 180° it had clearly changed; when I rotated it another 180° it returned to the initial position. I hope I could make it clearer.

Hi layeb

Do you have any setting in the TCP configuration?


Hi layeb

Is the shift due to the mechanical position and mounting of the gripper?


9. Douglas Mejia

Hello,

How would I go about setting the position of a single joint using UR script?

For example if all I wanted to do was rotate the base 45 degrees what script command could I use?

Many thanks,
Douglas Mejia

Hello Douglas

Thanks for the question.

Maybe consider this method.

Program
Robot Program
MoveJ
Wait: 0.01
Waypoint_1
var_1≔get_actual_joint_positions()
var_2≔var_1[0]
Wait: 1.0
var_2≔var_2+0.785
var_1[0]=var_2
var_1
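The Polyscope program above reads the current joint positions, adds 0.785 rad (45 degrees) to the base joint and moves there. From a remote host the same idea can be sketched by building the movej() line in Python. Note that base_turn_command is a hypothetical helper, and the current joint angles are assumed to be known (for example read from the robot's client interface):

```python
import math

def base_turn_command(current_joints, degrees):
    # Hypothetical helper: copy the current joint angles (in radians)
    # and turn only the base joint by the given number of degrees.
    joints = list(current_joints)
    joints[0] += math.radians(degrees)   # 45 deg -> 0.785 rad, as above
    return "movej([%s], a=1.4, v=1.05)" % ", ".join("%.4f" % j for j in joints)

# Example: base at 0 rad turned 45 degrees.
print(base_turn_command([0.0, -1.57, 0.0, -1.57, 0.0, 0.0], 45))
```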


10. Pasi

Hey!

How can I calculate the hole circle coordinates with UR robot?
The purpose is to tell the robot the number of holes on the perimeter and the radius of the circle, whereby the robot calculates the position of the points in the X and Y directions relative to the centre point and moves the tool to the calculated point.

Hey Pasi

Thanks for the question.

The robot has a range of trigonometry functions such as

acos()
asin()
atan()
cos(f)
d2r()
sin()

And then using math it might be possible to create an equation for your task.

Or it might be possible to use the function “pose_trans” to turn the tool head by the number of degrees that represents one increment of the full circle, based on the number of holes – and then let the tool head X axis line up with the radius, so that when given the radius, the robot moves that amount along the tool head X axis, with a degree increment based on the number of points on the full circle.
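The trigonometry part of a hole circle can be sketched directly (plain Python here; the same math works with the robot's sin(), cos() and d2r() functions). Every hole sits at an angle of i x 360 / n degrees on a circle around the centre:

```python
import math

def hole_positions(cx, cy, radius, n_holes, start_deg=0.0):
    # Evenly spaced holes on a circle around centre (cx, cy).
    # Angles are measured from the X axis; units follow the inputs
    # (metres here, matching the pose coordinates used in scripts).
    points = []
    for i in range(n_holes):
        a = math.radians(start_deg + i * 360.0 / n_holes)
        points.append((cx + radius * math.cos(a), cy + radius * math.sin(a)))
    return points

# 8 holes on a 50 mm radius circle around X = 0 m, Y = 0.43 m:
for x, y in hole_positions(0.0, 0.43, 0.05, 8):
    print("X = %.3f m, Y = %.3f m" % (x, y))
```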


1. Pasi

Thank you for the quick reply.

I made a simple calculation with the robot: x = 22 * sin(22.5). The result is -10.717, but the correct answer would be 8.419. What did I do wrong?


Hey Pasi

Notice that the “sin” argument needs to be in radians.

With this program the variable "var_2" gets the value 8.419:

Program
BeforeStart
var_1≔0
var_2≔0
Robot Program
Wait: 0.01
var_1≔d2r(22.5)
var_2=22*sin(var_1)
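The same calculation in Python confirms the value (math.radians is the Python equivalent of the robot's d2r()):

```python
import math

# Mirror of the Polyscope program above: convert 22.5 degrees to
# radians before taking sin(), then multiply by 22.
var_1 = math.radians(22.5)   # d2r(22.5)
var_2 = 22 * math.sin(var_1)
print(round(var_2, 3))   # 8.419
```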


11. Safa

Hi Lars,

I am trying to turn the head of a UR10 around the Z axis by giving it an angle value through a TCP/IP connection. What I noticed is that rx, ry and rz change simultaneously, so when I send it an rz value, it takes a weird position that has nothing to do with what I wanted initially. Is there something I can do to fix this issue?

Hi Safa

Thanks for the question.

Notice there is a difference in the orientation of the Z axis, depending on whether it is seen with the robot Base as reference or with the robot tool head as reference. The actual robot position (current robot pose) also influences how many joints turn in order to make the move around the Z axis using the Rz parameter.

If the tool head is pointing straight down:

Then if the robot turns around the Z axis using the Rz parameter with the Base as reference – only the last joint turns, because the tool head is in line with the robot base's Z direction, which is straight (vertically) up and down.

But if the tool head is in an angle:

Then if the robot turns around the Z axis using the Rz parameter with the Base as reference – several joints might turn, because the TCP point is turning around the Z axis, which might mean following a big circle around the vertical Z axis, which is still straight up/down. It is the TCP point that follows the Z axis.

This move can be done by using the “speedl” command.

If instead the robot is to turn around the tool head's own Z axis, then the command “pose_trans” might be considered.

Below is a program example of both commands – first “speedl” is used, and thereafter “pose_trans”.

If the tool head is straight up/down when starting the program – then with both commands only the last joint turns.

But if the tool head is at an angle when starting the program – the turning results are very different.

Program
Robot Program
Wait: 1.0
speedl([0,0,0,0,0,0.1], 1.2,2.0)
Wait: 2.0
var_1≔get_actual_tcp_pose()
var_2≔p[0,0,0,0,0,-0.5]
var_3≔pose_trans(var_1,var_2)
MoveL
var_3


1. Safa

Thank you very much for the quick reply,

but I wanted to ask if rx=0, ry=3.14 and rz=0 can be considered a “pointing down” position.

Hi Safa

Yes it is.


1. Safa

Hi Lars,

Thank you again, you are of a huge help.
So the speedl worked fantastically, and the tool head turned around the Z axis. But when I used the same values you sent me, i.e. speedl([0,0,0,0,0,0.1], 1.2, 2.0), wrist 3 turned by an angle of 11.23° = 0.196 radian. Is there a calibration to make?

Hi Safa

The speedl parameters are not related to the angle. The data given represent “speed”, “acceleration” and “time”. More information can be found in the script manual.


12. Baptist Vermeulen

hello,

I am a master student, and for my master thesis I am trying to send an X, Y and Z coordinate and a head rotation to the robot via a socket connection. The robot is used as a palletising robot, so the head always needs to point down. I know I need to use moveL to move the robot linearly, but I am unable to figure out a good script for the robot where I can send 3 coordinates and a rotation via the socket connection. The coordinates are sent from a Visual Basic application I made.

Does someone already have such a script that I can look at, to know what I am doing wrong?

thank you

Hello Baptist

Thanks for the question.

I am not familiar with Visual Basic, but here are some examples in Python and Polyscope.

http://www.zacobria.com/universal-robots-knowledge-base-tech-support-forum-hints-tips/knowledge-base/script-from-host-to-robot-via-socket-connection/

http://www.zacobria.com/universal-robots-knowledge-base-tech-support-forum-hints-tips/knowledge-base/testing-ur-client-server-and-script-commands/

http://www.zacobria.com/universal-robots-knowledge-base-tech-support-forum-hints-tips/knowledge-base/planes-and-position-variables-1/


1. Baptist Vermeulen

hello, I try to send this command movej(p[0.00, 0.3, 0.2, 2.22, -2.22, 0.00], a=1.0, v=0.1) to the robot to move to this position, but I receive an error which says: TypeError: Not a list.
What do I need to do?

Hello Baptist

Does it move correctly when the same data is sent using the sockettest program?

http://www.zacobria.com/universal-robots-knowledge-base-tech-support-forum-hints-tips/knowledge-base/testing-tcp-connection-host-to-ur/


I just tried on a real UR3 robot to send the command

movej(p[0.00, 0.3, 0.2, 2.22, -2.22, 0.00], a=1.0, v=0.1)

First to port 30002 with sockettest, and it works OK – the robot moves to the position.

And then from within Polyscope, and it also works OK – the robot moves to the position.

2. Baptist Vermeulen

in my case it is a UR5 robot, and it's connected with sockettest at IP 192.168.0.9 via port 30002. When I send the command through sockettest I get the error again. Could it be that the Polyscope software is outdated? Do you have experience with older software? The one I use is from June 10, 2011.

thank you

It can be the software version – however, this command has been supported for a long time.

Does it work to set output ports ?

set_digital_out(1, True)

set_digital_out(1, False)

Does the output change ?

There should be newer CB2 software versions available.


4. Baptist Vermeulen

yes, changing the outputs is no problem. Digital and analog outputs work fine via socket communication.

5. Baptist Vermeulen

hello, I have tried it with a Python script and I also get the error: Not a List

this was my Python script:
# Echo client program
import socket
import time
HOST = "192.168.0.9" # The remote host
PORT = 30002 # The same port as used by the server
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect((HOST, PORT))
#while s.recv ("set
s.send("set_digital_out(2,True)" + "\n")
time.sleep(0.05)
s.send("set_digital_out(7,True)" + "\n")
time.sleep(2)
#a = joint acceleration v = joint speed
time.sleep(6)
s.send("movej(p[0.60934089359556, 0.35978986875958, 0.79496641615542, -0.626084509357156, 1.53763507120778, -1.22970663123821], a=1.3962634015954636, v=1.0471975511965976) " + "\n")
time.sleep(4)
data = s.recv(1024)
s.close()

Hello Baptist

I just tried the command movej(p[0.00, -0.3, 0.2, 2.22, -2.22, 0.00], a=1.0, v=0.1) on a UR5 CB2 robot running software version 1.8.14035 and it works OK.

This robot is also from 2011, but the software has been updated to 1.8.14035.


13. Tolu

I want to use the UR5 robotic arm to place 400 mosaic tiles at different angles, i.e. X, Y positions plus an angle: 20 tiles across the X axis and 20 tiles across the Y axis, all with different angles. How can I do this without setting 400 waypoints manually, which is time consuming?

Hi Tolu

Thanks for the question.

Maybe consider working with positions as variables. Some examples at these links:

http://www.zacobria.com/universal-robots-knowledge-base-tech-support-forum-hints-tips/knowledge-base/position-variables-1/

http://www.zacobria.com/universal-robots-knowledge-base-tech-support-forum-hints-tips/knowledge-base/planes-and-position-variables-1/

http://www.zacobria.com/universal-robots-knowledge-base-tech-support-forum-hints-tips/knowledge-base/planes-and-position-variables-2/


14. Ralf

Hello Lars,

My question relates to the pallet function.
The pallet is taught over 4 corners, also approach, departure and the gripping position are taught.

Since the pallets in the experiment are always different from the taught positions, I want to track the position of the pallet with a camera.

The camera delivers X, Y, angle.
Now I want to change the ideal (learned) pallet position by X, Y, angle.

I tried to change the position with ‘addPose’, in the following way: add an offset to the corner position or pick position. But it does not work.

If I make some changes within the pallet function, the robot stops after the first position (loss of control) or moves to an unknown position.

Is there a way to move a pallet through offsets (X, Y, angle)?

Ralf

Hello Ralf

Thanks for the question.

Maybe consider using pose_trans, because that is referenced to the tool head. It might be necessary to apply pose_trans to each waypoint.


2. Matheus Salati

Hi Ralf,

Did you find a solution to your problem? I'm having the same difficulties using camera values to offset the robot position, and pose_trans is not working accordingly…

Thanks.

Hi Mattheus

Maybe there is also some information at this link.

http://www.zacobria.com/universal-robots-knowledge-base-tech-support-forum-hints-tips/knowledge-base/script-client-server/


1. Matheus

Thanks a lot Lars,

I'll study this link and try to solve the problem… it looks like pose_trans works to offset in X/Y, but the rotation appears to be done through another reference that is not the TCP…

15. khushboo

Hi,
We are planning to move the robot to simulate hand movement for testing a pedometer.

To simulate the hand movement of a person walking at a speed of 3 km/h, we need to calculate what acceleration/velocity values (in rad/sec) should be passed as arguments to the movej commands. We are not able to relate the walking speed in km/h of a person to their hand movement acceleration in rad/sec.

Can you please provide any kind of help in this case?

Hi Khushboo

Thanks for the question.

I think you need to find or produce some scientific kinematic data for human walking. Such data are not available in the robot.

Author:
By Zacobria Lars Skovsgaard
Accredited Universal Robots support Centre and Forum.

Also check out the CB3 forum

16. Ron Klein

I have been working with a new application that requires me to re-position the TCP based on the size of the material being picked up. If I start a new program and have the base mounting correct, I can move the TCP straight in view mode using the X or Y arrows on the Move screen. I would like to know if it is possible to use a command to input those coordinates into the program waypoints, so that when I want to pick up the pieces the TCP starts in the correct position. Is this possible, or am I trying to make this too easy and need to calculate the positions using a more complicated method?

Hi Ron

Thanks for the question.

If you are in programming mode then – yes, you can use the arrow keys and “Set this Waypoint”. See also this link for information on this:

http://www.zacobria.com/universal-robots-knowledge-base-tech-support-forum-hints-tips/knowledge-base/basic-ur-teach-waypoints/

If you want the trajectory to be linear then the waypoints need to be under a “Movel” – there are also some hints about that at this link

http://www.zacobria.com/universal-robots-zacobria-forum-hints-tips-how-to/movel-linear-movements/

If you need to move while the program is running, you can also use variables to change the position: if you know how far to move, you can keep these values in variables and move based on them – some information about that at this link:

http://www.zacobria.com/universal-robots-knowledge-base-tech-support-forum-hints-tips/knowledge-base/position-variables-1/

And it is also possible to create a plane and make moves in reference to a user-defined plane – some information about that at this link:

http://www.zacobria.com/universal-robots-knowledge-base-tech-support-forum-hints-tips/knowledge-base/planes-and-position-variables-1/

And you can also make the program so the operator can input X and Y values – some information about that at this link:

http://www.zacobria.com/universal-robots-knowledge-base-tech-support-forum-hints-tips/knowledge-base/operator-input-variables/

Author:
By Zacobria Lars Skovsgaard
Accredited Universal Robots support Centre and Forum.

17. Kevin Hess

Hi all,

I have a problem. The program I am working on requires a move to a position that varies in the z direction depending on the product I am running. Once in position, I want to just rotate the wrist 370 degrees and then return to zero. I need this move to be either variable or relative because of the changing z height. When I do the move, the wrist only rotates 10 degrees, as if the robot is taking the shortest distance. Any reason why?

Hi Kevin

It might be because, for the robot, a turn of 370 degrees (which is more than one revolution of 360 degrees) is the same as a turn of 10 degrees – the head ends up in the same position, and therefore the robot only turns 10 degrees.

Maybe consider using the Speedl command, where each individual joint can be controlled – or several joints at the same time. Some hints about Speedl can be found in this post:

http://www.zacobria.com/universal-robots-zacobria-forum-hints-tips-how-to/script-client-server-example/#comment-138926
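The wrap-around can be illustrated numerically: a Cartesian pose only stores the final orientation, so any requested turn collapses to its equivalent angle in [-180, 180) degrees. A small Python sketch (the helper name is mine):

```python
def pose_equivalent_angle(deg):
    """Collapse a requested turn (degrees) to the equivalent pose-space
    angle in [-180, 180): the orientation a Cartesian pose actually stores."""
    return ((deg + 180.0) % 360.0) - 180.0

# a commanded 370 degree turn stores the same orientation as 10 degrees,
# so a pose-based move only turns the short 10 degrees
print(pose_equivalent_angle(370.0))
```

To really turn the wrist more than 360 degrees, the turn has to be commanded in joint space (for example a movej on the wrist joint itself, or a timed joint-speed command), since a Cartesian pose cannot represent the extra revolution.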

Author:
By Zacobria Lars Skovsgaard
Accredited Universal Robots support Centre and Forum.

Also check out the CB3 forum

18. John Schimandle

I’m working on an application that scans a large X,Y rectangular area with a small diameter treatment area. It makes use of a large number of pose_trans() script function calls to go back and forth over the rectangular area in a straight line and each time the robot path is incremented by the size of the treatment area. The program uses a large number of small distance moves to prevent the robot joints from flipping over into an unwanted position. This is very similar to a spray paint operation on a flat rectangular plate.

What I’m seeing is that I have to specify the Y coordinate change with the opposite polarity of the direction I want to go. So if I want to create a new pose variable in the +Y direction then I have to specify the Y increment given to pose_trans() as negative.

here’s a basic URscript program flow for a +Y coordinate change of 20 mm (0.02 m)

cur_pos = get_actual_tcp_pose()
new_pos = pose_trans(cur_pos,p[0,-0.02,0,0,0,0]) # note that you must change polarity to negative

I have a URscript file that runs the majority of the program and I have printed out the variables and it shows the calculation being done incorrectly. For example, the Y coordinate of cur_pos (cur_pos[1]) is -0.446 and when you apply the -0.02 Y offset the Y coordinate of new_pos (new_pos[1]) is -0.426.

I cannot find any documentation that would explain this behavior.

Any help would be appreciated.

1. John Schimandle

I think I just figured it out. Moving the TCP to a new location uses the coordinate system of the tool and not the base coordinate system of the robot. The Y axis of the tool is opposite to the Y axis of the base coordinate system during normal operation of this treatment area program.

I figured this out by rotating the tool head 45 degrees and then initiating the same set of commands. The robot follows the TCP coordinate system and not the base coordinate system.

Another interesting thing is that the Move tab in the programming window does not show the tool coordinate system correctly when Feature is set to Tool – it always displays zeros.

Hi John

Thanks for the question.

Yes, the “pose_trans” command uses the tool-space coordinate system as reference, whereas the “pose_add” command uses the base coordinate system as reference.

The X-Y orientation of the tool space can be seen on the image in the Installation “TCP configuration” tab. This image can also be seen at this link:

http://www.zacobria.com/universal-robots-knowledge-base-tech-support-forum-hints-tips/knowledge-base/planes-and-position-variables-1/

The X-Y orientation of the base can be described as follows: the positive Y direction is the direction where the robot cable comes out of the base.

http://www.zacobria.com/universal-robots-zacobria-forum-hints-tips-how-to/x-y-and-z-position/

It is normal that when the Move screen uses the “Tool” feature, all coordinates show 0, because the tool is already at the TCP position. In the Move screen, if one of the coordinate values is clicked, the Pose Editor shows up – and if a coordinate value is changed there while the feature is “Tool”, for example incrementing the X direction by 10 mm, then the robot moves 10 mm in the X direction of the tool orientation. This is best seen if the tool is turned maybe 45 degrees, so that it is not aligned with the base coordinate system – then it is clear that an X-direction move (with the feature set to “Tool”) is at an angle relative to the base coordinate system, because it follows the tool-space coordinate system.
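The difference between the two references can be sketched numerically. The Python fragment below composes only the position part of a pose (rotation composition is left out for brevity) and is an illustration of the idea, not UR's implementation:

```python
import math
import numpy as np

def rotvec_to_matrix(rv):
    """Rodrigues' formula: rotation vector [rx, ry, rz] to a 3x3 matrix."""
    rv = np.asarray(rv, dtype=float)
    theta = np.linalg.norm(rv)
    if theta < 1e-12:
        return np.eye(3)
    kx, ky, kz = rv / theta
    K = np.array([[0.0, -kz, ky],
                  [kz, 0.0, -kx],
                  [-ky, kx, 0.0]])
    return np.eye(3) + math.sin(theta) * K + (1.0 - math.cos(theta)) * (K @ K)

def pose_trans_pos(p1, p2):
    """Position part of a pose_trans-style composition:
    p2's offset is expressed in p1's (tool) frame."""
    return np.asarray(p1[:3]) + rotvec_to_matrix(p1[3:]) @ np.asarray(p2[:3])

def pose_add_pos(p1, p2):
    """Position part of a pose_add-style composition:
    offsets add directly in the base frame."""
    return np.asarray(p1[:3]) + np.asarray(p2[:3])

# TCP at x = 0.4 m, rotated 90 degrees about the base Z axis
tcp = [0.4, 0.0, 0.3, 0.0, 0.0, math.pi / 2]
offset = [0.02, 0.0, 0.0, 0.0, 0.0, 0.0]   # "+20 mm in X"

print(pose_trans_pos(tcp, offset))  # the move comes out along base Y
print(pose_add_pos(tcp, offset))    # the move stays along base X
```

With the tool rotated 90 degrees about base Z, the same “+20 mm in X” offset ends up along base Y when routed through the tool frame – the kind of surprise described above.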

Author:
By Zacobria Lars Skovsgaard
Accredited Universal Robots support Centre and Forum.

Also check out the CB3 forum

19. Mathias Siig Nørregaard

Here’s my C# implementation based on Erwin’s math. I noticed you have to change the theta value depending on the sign of your roll. At least this works for me… please point out any flaws. Thanks.
Given a direction vector, convert that to a roll and a yaw and convert that to a rotation vector:
http://pastebin.com/JAmGmKDV

1. Harsh Sheth

Hello Mathias,

There is a lack of C# implementations on this forum. I had a question about reading the 1044-byte incoming data packet from the robot for the pose. A Python implementation has been shown, but no C#. I tried implementing it in C# and isolated the necessary 48 bytes for the pose, but converting that hex to double gives weird scientific-notation values with e-200 and such. Can you help with this?

Hello Harsh

C# is not the theme of this forum, but a little information for a different function can be found at this link.

http://www.zacobria.com/universal-robots-knowledge-base-tech-support-forum-hints-tips/c-code-example-of-converting-rpyeuler-angles-to-rotation-vectorangle-axis-for-universal-robots/
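On the byte-level question: the UR client interfaces stream IEEE-754 doubles in big-endian (network) byte order, and decoding them as little-endian is a common cause of nonsense magnitudes such as 1e-200. A minimal Python sketch of the decoding (the exact byte offsets of the pose within the package vary between software versions, so the slicing is not shown):

```python
import struct

def parse_doubles_be(payload, count):
    """Decode count IEEE-754 doubles from big-endian (network order) bytes.
    Decoding the same bytes as little-endian is a classic cause of
    nonsense magnitudes such as 1e-200."""
    return struct.unpack('>' + 'd' * count, payload[:8 * count])

# round trip: pack a pose big-endian as the robot would, then decode it
pose = (0.4, -0.2, 0.3, 0.0, 3.14, 0.0)
wire = struct.pack('>6d', *pose)
print(parse_doubles_be(wire, 6))
```

The same fix applies in C#: reverse the 8 bytes of each field before calling BitConverter on a little-endian machine.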

Author:
By Zacobria Lars Skovsgaard
Accredited Universal Robots support Centre and Forum.

Also check out the CB3 forum

20. Michael

Hi! I am a student of the UR system. I am writing a program in C++ to move the UR to a position. As you know, controlling the UR with the “movel” command needs six parameters: X, Y, Z, Rx, Ry, Rz. The three parameters X, Y, Z can be obtained in other ways, for example from a Kinect, but Rx, Ry, Rz cannot be obtained like X, Y, Z, so I need to calculate them. What confuses me is that I can’t find an equation between X, Y, Z and Rx, Ry, Rz. Do you have any idea how to work out the pose (Rx, Ry, Rz) for a target point (X, Y, Z)?

thank you very much.
Michael

Hi Michael

Thanks for the question.

Rx, Ry and Rz are the rotation vector components for the tool head rotation. How to calculate them depends on the desired tool head orientation, and the calculation involves maths that is not a default function in the robot, but some hints can be found at these links.

http://nghiaho.com/?page_id=846

http://www.euclideanspace.com/maths/geometry/rotations/conversions/angleToMatrix/

http://www.zacobria.com/universal-robots-zacobria-forum-hints-tips-how-to/x-y-and-z-position/#comment-147810

Author:
By Zacobria Lars Skovsgaard
Accredited Universal Robots support Centre and Forum.

Also check out the CB3 forum

21. Rahul

Hello,

How can the robot be moved safely, without any collision with other components in the workspace, to the home position after a power failure (or whenever the user requests it), from wherever the robot was before the power failure? Are there any direct options for this in the new UR5 software using Safety Modes?

22. Victor

Hi Lars,

I have a question regarding the rotation vector. In the teaching pendant, I can set the tcp orientation by either “rotation vector” or “RPY”.

May I know exactly how these two notations can be inter-converted? I would like to set the TCP via script, and in script I can only use rotation vectors.

Regards,
Victor

Hi Victor

I have not tried such conversion, but there has been an example of conversion in a previous post at this link.

http://www.zacobria.com/universal-robots-zacobria-forum-hints-tips-how-to/x-y-and-z-position/#comment-147810

http://www.euclideanspace.com/maths/geometry/rotations/conversions/angleToMatrix/

http://nghiaho.com/?page_id=846

Author:
By Zacobria Lars Skovsgaard
Accredited Universal Robots support Centre and Forum.

Also check out the CB3 forum

1. Victor

Hi Lars,

Thanks a lot for the reply! The code in the example links works like a charm. It’s a shame that I didn’t read through the comments in detail.

Regards,
Victor

1. Nicholas Pestell

Hi Victor,

I too am working with the same problem.

I am using the script written by Erwin, which does not seem to work perfectly for me. I’m finding that it gets the angles slightly off. For instance, when trying to achieve (0, 180, 20) RPY by plugging the corresponding values into the script, the axis is slightly off what it should be, and consequently the pitch is off by around 3 degrees. Have you experienced any of the same issues?

Could this be related to rounding?

Nick

1. Victor

Hi Nicholas,

I’m re-implementing the code in C. I did face a similar problem, and once I drilled further down, it is the problem of sin(theta). Erwin mentioned adding an exception for theta equal to 0; however, in the implementation we need to check whether sin(theta) equals 0 or not.

I’m still researching on how to cater for that case.

Hope this helps.
Vic

2. Nicholas Pestell

Hi Lars & Victor,

Thank you so much for your responses. I’m sorry that I didn’t notice them sooner, and hence have only just got back to you.

Lars, I have confirmed that when I use 0, 180, 20 for roll, pitch, yaw respectively, the program gives me different values to yours. I actually get -0.537243984826, 3.04686204384, 0.0. These are different from the robot’s. It actually doesn’t make sense that I get values at all for the 0, 180, 20 case, since at these values the theta in “multi = 1 / (2 * math.sin(theta))” is 180 degrees, and hence there’s a division by zero. In fact, as far as I can tell, the formula breaks whenever two or more of the roll, pitch, yaw values add up to a multiple of 180.

So in the 0, 180, 20 case there is certainly some sort of rounding error occurring, which seems to happen at “math.sin(theta)”, which doesn’t actually return 0 – it seems for the reason outlined here:

http://stackoverflow.com/questions/15980819/exact-sine-cosine-tangent-of-various-angles

I’ve just realised this is essentially what you were saying, Victor. I’m wondering if you’ve found a solution to the problem? Have you discovered a suitable alternative for the theta = n·180 case?

Many thanks,

Nick

23. Filippo

Hi Lars,
What is the precision/resolution of the position given by the get_actual_tcp_pose() function? The emulator, and thus Polyscope, shows 0.01 mm. Is it reliable?

Thanks
Filippo

24. Erwin Damsma

Hello everyone,

I’m seeing some questions about converting RPY/Euler angles to the rotation vector/angle-axis representation that the robot uses. I made this example code (in Python), using information from a post by Jonathan.

###################################################
import math
import numpy as np

roll = 2.6335
pitch = 0.4506
yaw = 1.1684

yawMatrix = np.matrix([
[math.cos(yaw), -math.sin(yaw), 0],
[math.sin(yaw), math.cos(yaw), 0],
[0, 0, 1]
])

pitchMatrix = np.matrix([
[math.cos(pitch), 0, math.sin(pitch)],
[0, 1, 0],
[-math.sin(pitch), 0, math.cos(pitch)]
])

rollMatrix = np.matrix([
[1, 0, 0],
[0, math.cos(roll), -math.sin(roll)],
[0, math.sin(roll), math.cos(roll)]
])

R = yawMatrix * pitchMatrix * rollMatrix

theta = math.acos(((R[0, 0] + R[1, 1] + R[2, 2]) - 1) / 2)
multi = 1 / (2 * math.sin(theta))

rx = multi * (R[2, 1] - R[1, 2]) * theta
ry = multi * (R[0, 2] - R[2, 0]) * theta
rz = multi * (R[1, 0] - R[0, 1]) * theta

print(rx, ry, rz)
###################################################

Mind you, when all input values are 0 there will be a division-by-zero error at this line:
# multi = 1 / (2 * math.sin(theta))
In this case theta will be 0, and the sine of 0 is 0, so you’ll probably have to make a special case for this and set rx, ry and rz to 0 directly.
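For completeness: theta near 180 degrees is a second singular case of the same formula (sin(theta) is again 0 there), which comes up in the replies on this page. A hedged sketch of a converter that special-cases both (the function is my own, not a robot function):

```python
import math
import numpy as np

def matrix_to_rotvec(R, eps=1e-6):
    """Rotation matrix to rotation vector, with special cases for the
    two singularities of the sin(theta) formula (theta near 0 and pi)."""
    tr = R[0, 0] + R[1, 1] + R[2, 2]
    # clamp the acos argument against floating-point drift
    theta = math.acos(max(-1.0, min(1.0, (tr - 1.0) / 2.0)))
    if theta < eps:
        return np.zeros(3)                      # identity rotation
    if math.pi - theta < eps:
        # theta ~ pi: here R = 2*a*a^T - I, so recover the axis from
        # the row with the largest diagonal element (best conditioning)
        i = int(np.argmax(np.diag(R)))
        a = np.zeros(3)
        a[i] = math.sqrt(max(0.0, (R[i, i] + 1.0) / 2.0))
        for j in range(3):
            if j != i:
                a[j] = R[i, j] / (2.0 * a[i])
        return a * theta
    # generic case: axis from the skew-symmetric part, as in the code above
    axis = np.array([R[2, 1] - R[1, 2],
                     R[0, 2] - R[2, 0],
                     R[1, 0] - R[0, 1]]) / (2.0 * math.sin(theta))
    return axis * theta
```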

Best regards,

Erwin Damsma

1. Nicholas Pestell

Hi Erwin,

Thanks for this. I’m essentially using this script to convert from Euler to axis-angle representation, although I’m finding that it gets the angles slightly off. For instance, when trying to achieve (0, 180, 20) RPY by plugging the corresponding values into your script, the axis is slightly off what it should be, and consequently the pitch is off by around 3 degrees.

I’m wondering if anybody else has found this a problem?

Regards,

Nick

2. Victor

Hi Erwin,

May I know if there is any reference/definition for that? I think I need to drill deeper into the sin(theta) equals 0 case.

Regards,
Victor

3. Niels

Hello Erwin,

thanks for this.
Maybe you can help me.
I have developed an augmented reality application for the HoloLens that allows me to control a hologram of a UR10. The development environment is Unity (C#).

I do have the option to export the Euler angles. However, in Unity these are applied in the order Z, X′, Y″. In addition, the axes are in a left-handed coordinate system.

If you transfer this into a right-handed coordinate system (UR10), the result is the following order of the Euler angles: pitch, roll, yaw (roll and pitch are swapped) – I hope at least that’s right :-D

Can you tell me how to customize your script to handle those Euler angles?

Thank you and best regards!
Niels

1. Niels

I have to correct something…

All three rotations in Unity seem to be applied in WORLD Space. So the order is:
z degrees around the WORLD Z axis, x degrees around the world X axis, and y degrees around the world Y axis. With positive Y pointing up, positive X pointing to the right and positive Z pointing backwards (in the screen).

This needs to be converted in UR10 order:
x degrees around the LOCAL X axis, y degrees around the local Y axis, and z degrees around the local Z axis. With positive Z pointing up, positive X pointing to the right and positive Y pointing backwards.

I have no idea how to do this…

Thank you very much!
Niels

Hi, Erwin!
Your code has really helped me out – I am new to vectors and am learning Python.

Can this vector in your script (Rx, Ry, Rz) be seen as the direction vector of the rotation?
It is not quite the same as the one described in: https://stackoverflow.com/questions/1568568/how-to-convert-euler-angles-to-directional-vector for example…

If you could help me out with this question I would be very grateful! I have asked this to many of my friends now but they can’t really understand it.

Best regards Max

25. Osiris

Hi,
Is it possible to initialize a variable that defines a pose from a file – and vice versa, to save a pose to a file to use it as a calibration parameter?

Thanks

Hi Osiris

Thanks for the question.

It is possible to save variables as “installation variables”, which are stored in the file “default.variables” in the same directory where the Polyscope program was created on the robot. Installation variables survive a power cycle of the robot. Installation variables are created in the “Installation” – “Variables” tab on the robot, and they are created with the letters “i_xx” in front in order to indicate that they are installation variables. Below is an example.

This will create the file “default.installation” with the following contents.

These installation variables can be fetched, used and rewritten during program run, as the example below shows. Notice the example shows different ways to store a pose – either as the entire pose, or as individual elements.

The program runs through 3 waypoints, and after arriving at a waypoint the position is stored in installation variables. If the robot is powered off and restarted, the robot will move to the last position of the previous run. (Notice – as the variables “last_pos_1″ and “last_pos_2″ get the same value, but by different methods, the robot will move to that last position and wait 1+1 = 2 seconds.)

Author:
By Zacobria Lars Skovsgaard
Accredited Universal Robots support Centre and Forum.

Also check out the CB3 forum

26. Dominik

hi,
I guess this question has been answered before, but I couldn’t find it.
I’m currently working with the UR5 and a vision system, which gives me x, y, z and rz.
When no rotation is involved, everything works fine and the robot moves to the desired position, but when there is a rotation, the TCP does not move as expected.
With no rotation of the part, I have a TCP rotation of rx=90° and ry=45°, caused by my tool.
When I set rx there are no problems, but when I then try to set ry to 45°, it tilts around the z-axis as well.
Doing it manually, by moving the last joint to 45°, the base feature shows rx=84°, ry=35° and rz=-35°, although I only turned around the y-axis.
The same happens when I get rz from the camera and want to set rz – it doesn’t move as it should.
I’m using the movel command to perform this.
Is there another way to get my z-rotation correct, or can you explain why I get those values?
I only want to rotate around z by a defined angle.
Maybe I can use another command – actually I would be fine with just setting the joint to the desired angle with a script command, because then I could use the pose the robot calculated.

Hi Dominik

It might be the reference to the base that causes the move you observe. Moves can be referenced to the base, the TCP, or user-defined planes.

Maybe consider referencing the TCP instead, like the example at the link below.

It is the command “pose_trans” that references the TCP – whereas “pose_add” references the base coordinate system.

http://www.zacobria.com/universal-robots-knowledge-base-tech-support-forum-hints-tips/knowledge-base/script-client-server/

Author:
By Zacobria Lars Skovsgaard
Accredited Universal Robots support Centre and Forum.

Also check out the CB3 forum

1. Dominik

I’m doing it now with movej and get_actual_joint_position, like:
waypoint1
var1=get_actual_joint_position
movej([x,y,z,rx,ry+pi/4,rz])
I just tried to implement it with pose_trans, but it wasn't working.
The code was like:
var1=get_actual_tcp
var2=[0,0,0,0,0,d2r(45)]
var3=pose_trans(var1,var2)
movel(var3, 0.5, 0.5)
With neither pose_trans nor pose_add does it go to the correct position..
Where is my fault? I just want the robot to rotate around the z-axis (at the current position, around joint 5).

Hi Dominik

When you say joint 5 – do you mean the last joint (wrist 3) with the TCP?

Also – what is your reason for using two different assignments of var1, i.e. “var1=get_actual_joint_position” and “var1=get_actual_tcp”?

For the last one

var1=get_actual_tcp

have you tried

var1=get_actual_tcp_pose()

(Notice the extra “_pose” compared to your code.)

and then
var2=[0,0,0,0,0,d2r(45)]
var3=pose_trans(var1,var2)
movel(var3)

Author:
By Zacobria Lars Skovsgaard
Accredited Universal Robots support Centre and Forum.

Also check out the CB3 forum

1. Dominik

sorry, I was a bit sloppy writing..
By joint 5 I mean wrist 2, not the last one.
Oh, I don’t assign var1 twice – it refers to two different programs; I should have used two different variable names in the example. With the first method, using the joints to get the movement, it’s working, so actually I’m fine with that – I’m just curious why it’s not working with the second method, with the pose_trans command.
I used the get_actual_tcp_pose() command and the command works – in var1 I get the 6 values, and it also executes the rest of the example without a problem. But as far as my understanding goes, the robot should go to the exact same spot with both methods, because in my case rotating wrist 2 by 45° is a rotation only around the y-axis, and with the pose_trans command it should also perform a 45° rotation around the y-axis…

Hi Dominik

As I can see, you are making reference to Ry in one example (in the command below, “ry+pi/4″ is in the position of Ry in the list).
movej([x,y,z,rx,ry+pi/4,rz])

Whereas in the other example you are making reference to Rz (in the command below, the "d2r(45)" is in the position of Rz in the list).
var2=[0,0,0,0,0,d2r(45)]
var3=pose_trans(var1,var2)

And also, the command “pose_trans” uses the TCP coordinate system as origin for the translation calculation – whereas movej uses the base as origin for the move.

And whether wrist 2 turns around the Y axis depends on the pose of the robot. Only rarely, and at a certain pose, will a turn cause only wrist 2 to move, because all 6 joints make up the pose – a turn around the Y axis is not a turn of an individual joint, but a turn around the Y axis in Cartesian space, which most often means several joints will turn.

Regards
Zacobria

27. Maarten

Hello,

I want to use a cognex camera for pick and place. When I translate the product the program works fine, but when I rotate the product, the gripper will rotate correctly but the new x and y position of the gripper is always off.

Maybe the problem is that my gripper does not grab the product in a horizontal position (the z-axis of the flange is not parallel to the z-axis of the robot base when in the pick-up position). Might that cause a translation problem?

What I want is only a change in the x and y position w.r.t. the robot base origin, and a rotation about an axis through the center of the gripper that is parallel to the Z axis of the robot base. But apparently the other angles change as well, since I’m apparently using the wrong transformation function?
I perform all the movements w.r.t. the base coordinates.

An example:

Original position
PosZeroPick = p[-0.56715, 0.04634, 0.39688, 0.0223, -2.9376, -0.0111]

Difference in position, obtained from camera measurement:
AddPos: p[0.05684, 0.033725, 0.0, 0.0, 0.0, 0.32190305]

Transformation

New position to pick up part
NewPosPick = p[-0.51031, 0.080065, 0.39688, -0.4489468, -2.9044101, 0.037219778]

Any ideas?

Thanks!
Maarten

Hi Maarten

Maybe two ideas to consider.

1.
Use of Plane feature. Then it is possible to have a coordinate system that is not parallel to the surface.

2.
Or, instead of using “pose_trans”, consider the “pose_add” command – because “pose_add” uses the base as reference, whereas “pose_trans” uses the TCP as reference.

Author:
By Zacobria Lars Skovsgaard
Accredited Universal Robots support Centre and Forum.

Also check out the CB3 forum

1. Maarten

Thank you very much for your quick response!

In the end the problem was that I had not set my origin correctly in the camera algorithm. The part is not grabbed in the center but quite far away from it, while the marker position of the origin was placed in the center of the part. Therefore the correction was incorrect when the part was at an angle.

Maarten

28. Anita Macher

Hi,
I am currently programming a UR10 via scripting. I want to move the robot with joysticks: the robot should move as long as the user pushes the joystick in one direction. I use the speedl function. My problem is that the robot moves in the global coordinate system and not in the TCP coordinate system.
Here is a code snippet:

acceleration = 1.2
speed = [movingSpeed, 0, 0, 0, 0, 0]
speedl( speed, acceleration, 90000 )

The variable movingSpeed is calculated out of the input the joystick gives.
How can I move in the tcp coordinate system?

Anita

Hi Anita

With the script function “pose_trans” the transformation can be calculated based on the TCP coordinate system.

(With the script function “pose_add” the transformation can be calculated based on the Base coordinate system).

Author:
By Zacobria Lars Skovsgaard
Accredited Universal Robots support Centre and Forum.

Also check out the CB3 forum

1. Anita Macher

Hi Lars,

thanks for your answer. I tried this already, but then I get an error message that this function needs a pose and not a list. When I change the speed to a pose, the “pose_trans” function returns no error, but the speedl function does, because it needs a list and not a pose.

Anita

1. Anita Macher

Hi Lars,

now it’s working. The tricky part was to subtract the TCP pose from the result of the pose_trans function.

2. John

We are having an issue with a UR5 robot. We cannot get the z-axis to read positive – it always reads negative. Any suggestions?

29. Bill

Hi,

We’d like to use a flat paint brush as the end effector. We need to control not only the brush’s center point and its center line’s orientation, but also how the brush rotates about its center line. However, a target pose does not seem to include that rotation aspect. I think the more general question is: can a UR robot control the self-rotation of an end effector?

Regards
Bill

Hi Bill

You can make the robot turn around one axis, two axes, or all three axes. A pose can be expressed and controlled as p[X, Y, Z, Rx, Ry, Rz], where Rx is the rotation around the X axis, Ry the rotation around the Y axis, and Rz the rotation around the Z axis. It is possible to provide parameters for one, two or all three of these axes.

In this example there is a rotation around the Z axis. (It is a client-server example, but the principle is the same locally on the robot.)

http://www.zacobria.com/universal-robots-knowledge-base-tech-support-forum-hints-tips/knowledge-base/script-client-server/

Author:
By Zacobria Lars Skovsgaard
Accredited Universal Robots support Centre and Forum.

Also check out the CB3 forum

30. Shahar

I’m trying to calculate the transformation matrix (4×4) from the position vector (1×6). I know how to do that from a normal XYZ roll-pitch-yaw vector, but rx, ry and rz aren’t roll-pitch-yaw angles.
Is there a way to calculate the RPY angles from the position vector?

thanks,
Shahar

1. Shahar

That’s a 144-page book – can you maybe be a little more specific?

Again, just to clarify: I’m looking to calculate a rotation matrix from the rx, ry and rz I get from the “get_actual_tcp_pose” command.
In the book you’ve linked they do it from the Euler angles (page 12).

Hi Shahar

I think it is a good book.

You might consider scrolling down this page to the posts by Jonathan from March 2015 through July 2015 (there are several posts) – especially the posts of 13 April 2015 and 21 May 2015 below.
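For the conversion asked about here, the usual route is rotation vector → rotation matrix → RPY. A minimal Python sketch, assuming the same R = Rz(yaw)·Ry(pitch)·Rx(roll) convention used in Erwin’s Python example elsewhere on this page (the helper names are mine):

```python
import math
import numpy as np

def rotvec_to_matrix(rv):
    """Rodrigues' formula: rotation vector [rx, ry, rz] to a 3x3 matrix."""
    rv = np.asarray(rv, dtype=float)
    theta = np.linalg.norm(rv)
    if theta < 1e-12:
        return np.eye(3)
    kx, ky, kz = rv / theta
    K = np.array([[0.0, -kz, ky],
                  [kz, 0.0, -kx],
                  [-ky, kx, 0.0]])
    return np.eye(3) + math.sin(theta) * K + (1.0 - math.cos(theta)) * (K @ K)

def matrix_to_rpy(R):
    """Extract roll, pitch, yaw (radians), assuming R = Rz(yaw) Ry(pitch) Rx(roll)."""
    pitch = math.atan2(-R[2, 0], math.hypot(R[0, 0], R[1, 0]))
    roll = math.atan2(R[2, 1], R[2, 2])
    yaw = math.atan2(R[1, 0], R[0, 0])
    return roll, pitch, yaw

# a rotation vector of 90 degrees about base Z comes back as pure yaw
print(matrix_to_rpy(rotvec_to_matrix([0.0, 0.0, math.pi / 2])))
```

Note that the RPY extraction is ambiguous at pitch = ±90 degrees (gimbal lock).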

Author:
By Zacobria Lars Skovsgaard
Accredited Universal Robots support Centre and Forum.

Also check out the CB3 forum

31. Rahul

Hello,
How can I extract the x, y, z coordinates from the p[x, y, z, rx, ry, rz] form, to do some calculations in script code, such as the slope of a line between two points?

Hi Rahul

Thanks for the question.

It is possible to read the coordinates from the MODBUS registers – an example at this link.

Or it is also possible to read the coordinates from the data streamed out on port 30003 – an example at this link.

http://www.zacobria.com/universal-robots-knowledge-base-tech-support-forum-hints-tips/knowledge-base/client-interfaces-cartesian-matlab-data/

Author:
By Zacobria Lars Skovsgaard
Accredited Universal Robots support Centre and Forum.

Also check out the CB3 forum

32. Rahul

Hello,
How can I add an offset pose value to all waypoints in the program? I used it like this:

waypoint1
waypoint2
.
.
.
.
.

Then when I start, I get a pop-up saying “The robot has changed mode from OK to Initializing”, and it goes to the initializing screen. When I do the initialization and try to start again, the pop-up repeats.

Hi Rahul

Thanks for the question.

Is the order of reference as intended

or

http://www.zacobria.com/universal-robots-knowledge-base-tech-support-forum-hints-tips/knowledge-base/working-with-planes-and-variables-2/

Author:
By Zacobria Lars Skovsgaard
Accredited Universal Robots support Centre and Forum.

Also check out the CB3 forum

33. Parker Hewitt

This answer may be somewhere, but I have looked all over the place and am not fully understanding what I need to do. Currently, I have made a plane on the teach pendant. I can move in that plane with the movel command with its “feature” set to the plane I have created – such that if I give it the waypoint p[0,0,0,0,0,0] it moves to the origin of my plane, or if I give it p[0,0.2,0,0,0,0] it moves 0.2 meters in the Y direction relative to my plane.

However, I need to know how to get the pose I am at in that feature space. It seems get_actual_tcp_pose() returns the value in base coordinate space. Is there a simple way to translate this value to my plane coordinate space – such that I tell it to move +0.2 in a direction and then get a value back of +0.2 in my coordinate space, instead of arbitrary values in base coordinate space? I notice that if you go to the Move tab, you can select which feature you are in reference to, and it shows what I am talking about: if I select the plane I have created, it gives me the x, y, z, rx, ry, rz in reference to that plane. This is exactly what I need.

Also, is there a way for me to send a simple offset, for example p[0,0.2,0,0,0,0], so that it moves 0.2 in my Y value – not to the absolute position p[0,0.2,0,0,0,0] in my plane space, but just an offset of 0.2 in Y from its current position? I imagine it’s something along the lines of: send the offset I want to move, translate my current position in base coordinates to my current position in plane coordinates, add my offset, and have the robot movel to the newly found position. Any help would be greatly appreciated!!

Hi Parker

You might consider the pose_trans(var_1, var_2) command, where

var_1 = get_actual_tcp_pose()

and

var_2 is the offset to move.

Yes, get_actual_tcp_pose() returns the result in the base frame, but when a plane has been set, the plane variable pose is provided, which can be seen in the Variables tab. The plane pose might be possible to use directly instead of the var_1 method described above.

Author:
By Zacobria Lars Skovsgaard
Accredited Universal Robots support Centre and Forum.

Also check out the CB3 forum

1. Marc

Hello,

I experience exactly the same problem. I have defined a plane (PLAN_D is my variable created by the system). I have a pose (P_A). I need to express my pose P_A in PLAN_D coordinates. Here are my program lines:

set_tcp(TCP1_var) // Select the TCP i need
P_A=get_actual_pose() //Get my actual TCP1 pose in base coordinates
P_Result=pose_trans(Plan_D,P_A) //Express my pose in Plan_D coordinates

As it doesn't give the right coordinates, I'm certain I don't fully understand the pose_trans function.
Could you tell me what I am missing, with an example (program lines)?
Thank you

Hi Marc

A simple way to explain pose_trans is that it "transforms" from an origin to a new pose: the first argument is the origin, and the second argument is the offset that is applied with that origin as the reference.

And pose_trans uses the tool (TCP) coordinate system as reference, whereas pose_add uses the base coordinate system.

http://www.zacobria.com/universal-robots-knowledge-base-tech-support-forum-hints-tips/knowledge-base/position-variables-1/

http://www.zacobria.com/universal-robots-knowledge-base-tech-support-forum-hints-tips/knowledge-base/planes-and-position-variables-1/

http://www.zacobria.com/universal-robots-knowledge-base-tech-support-forum-hints-tips/knowledge-base/planes-and-position-variables-2/

Author:
By Zacobria Lars Skovsgaard
Accredited Universal Robots support Centre and Forum.

Also check out the CB3 forum

1. Marc

I understand the pose_trans() function now. It seems to me that alone this function is not really useful, because that operation is made by the system anyway when using movel() or movep(). I also found out how to express a point in a defined frame: I just needed to invert my defined plane.

Example with my previous program:

set_tcp(TCP1_var) // Select the TCP I need
P_A=get_actual_pose() // Get my actual TCP1 pose in base coordinates
P_Result=pose_trans(pose_inv(Plan_D),P_A) // Express my pose in Plan_D coordinates

It seems to work.
Thank you
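Marc's pose_inv recipe can be sanity-checked outside the controller. Below is a minimal Python sketch (not URScript, and all numbers are made up for illustration) of what pose_trans(pose_inv(plane), tcp) computes, restricted for brevity to a plane that is only rotated about the base Z axis; the controller of course handles full rotation vectors.

```python
import math

def pose_inv_then_trans(plane, tcp_xyz):
    """Sketch of pose_trans(pose_inv(plane), tcp) for the simple case where
    the plane is only rotated about base Z. `plane` is (x, y, z, rz) and
    `tcp_xyz` a TCP position, both in the base frame. Returns the TCP
    position expressed in the plane's coordinate system."""
    px, py, pz, rz = plane
    # step 1 of the inverse: translate into the plane's origin
    dx, dy, dz = tcp_xyz[0] - px, tcp_xyz[1] - py, tcp_xyz[2] - pz
    # step 2 of the inverse: rotate by -rz
    c, s = math.cos(-rz), math.sin(-rz)
    return (c * dx - s * dy, s * dx + c * dy, dz)
```

For example, with a plane at (0.1, 0.2, 0) rotated 90 degrees about Z, a TCP at (0.3, 0.2, 0.05) comes out as (0.0, -0.2, 0.05) in plane coordinates, which is the kind of "readable" value Parker and Marc were after.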

34. Elvis

Hello Lars,

We are trying to experiment with the speedl and stopl script commands, but we are not having much luck. It executes the code, but the robot does not move at all. My understanding of the speed command is that it can be used as an offset from the current position by specifying which direction/orientation you want to move in. Please see the code below and let me know if you have any insight into why we cannot get the robot to move.
This is the same Loctite dispensing application I posted about last week. We are going to be using a 2D vision system to offset the robot for unit-to-unit variation. In your opinion, what is the easiest way to accomplish the offsetting of the robot path? There can be a slight rotation (+/-15 deg) and maybe 150 mm variation in X and Y. It needs to be smooth, and it is a combination of linear and circular moves.

Program
Robot Program
TargetObject≔[0,0,0,0]
s≔.025
a≔.025
t≔1
OffsetX≔0
OffsetY≔0
OffsetR_D≔0
OffsetR_R≔d2r(OffsetR_D)
MoveP
HomePosition
MoveP
finalpos
speedl([0.042,-0.007,0.0,0.0,0.0,0.0],s,a,t)
stopl(1)
speedl([0.012,-0.01,0.0,0.0,0.0,0.0],s,a,t)
stopl(1)
speedl([0.012,-0.01,0.0,0.0,0.0,0.0],s,a,t)
stopl(1)
speedl([0.076,-0.01,0.0,0.0,0.0,0.0],s,a,t)
stopl(1)
MoveP
HomePosition
Wait: 0.1
TargetObject≔[0,0,0,0]
Wait: 3.0

Hi Elvis

I just tried the program below, which takes the essential parts of the speedl commands from your program:

Program
Robot Program
MoveJ
Waypoint_1
speedl([0.042,-0.007,0.0,0.0,0.0,0.0],0.25,0.25,1)
stopl(1)
Wait: 3.0
speedl([0.012,-0.01,0.0,0.0,0.0,0.0],0.25,0.25,1)
stopl(1)
Wait: 3.0
speedl([0.012,-0.01,0.0,0.0,0.0,0.0],0.25,0.25,1)
stopl(1)
Wait: 3.0
speedl([0.076,-0.01,0.0,0.0,0.0,0.0],0.25,0.25,1)
stopl(1)
Wait: 3.0

And it works OK: the robot moves at the speedl commands. It is just a very small move, but it moves. I inserted waits of 3 seconds for testing so I had time to observe the robot actually moving, and it does.

Yes, your idea is one way to do it, but note that the speedl command is much affected by the time setting, because it runs for a given time, and the time together with the speed gives the distance.
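To see why the moves look so small, one can estimate the distance a speedl(v, a, t) call covers: the TCP ramps up at acceleration a and, if it reaches the target speed before time t, coasts for the remainder. A rough Python sketch (my own helper; it ignores the stopl deceleration and assumes the s/a/t values in the program map onto speedl's speed, acceleration and time arguments):

```python
def speedl_distance(speed, accel, t):
    """Rough distance covered by speedl: ramp up at `accel` toward `speed`,
    then coast at `speed` until time t. stopl deceleration is ignored."""
    t_ramp = speed / accel          # time needed to reach the target speed
    if t <= t_ramp:
        return 0.5 * accel * t * t  # never reaches the target speed
    # ramp-up covers speed^2 / (2*accel) less than constant-speed travel would
    return speed * t - speed * speed / (2.0 * accel)
```

With the first move above (speed magnitude about 0.043 m/s, a = 0.025, t = 1) the robot never even reaches the target speed and travels only about 12.5 mm, which matches the "very small move" observed.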

Another way is to use the pose_trans command, which is often used where there is a vision system. pose_trans can be used so the robot moves from where it currently is with a given offset, which is the second argument of pose_trans.

One example here

http://www.zacobria.com/universal-robots-knowledge-base-tech-support-forum-hints-tips/knowledge-base/script-client-server/

and one example here

http://www.universal-robots.com/how-tos-and-faqs/how-to/ur-how-tos/vision-system-with-the-robot-15576/

When using a vision system, the offset can then be the coordinates given by the vision camera.

Author:
By Zacobria Lars Skovsgaard
Accredited Universal Robots support Centre and Forum.

Also check out the CB3 forum

35. Veit

Hello Lars,
first of all, I have the indigo-devel version of the ur5 package, using a 01/2010 robot with URControl version 1.5.7849.

When I am in the Move tool on the robot control panel, I can rotate every joint manually and continuously. Is it possible to do this from a program instead of from the touch panel? I'd like to write a program in Matlab/ROS which sends a command like "move TCP in +/- X axis" or "rotate joint Y", both continuously for as long as the command is being published or active, i.e. without waypoints.

When using ROS, I need a fixed end point, and the robot stops after each command and calculates a new path.

Regards,
Veit

Hello Veit

I think it is best to upgrade the software to the latest 1.8 software for CB2 controller.

Then try looking at the script command servoj for joint moves, or even better speedj, and speedl for pose moves.

Author:
By Zacobria Lars Skovsgaard
Authorised Universal-Robots Distributor.

1. Veit

Hello again,

thanks for the fast reply. I think I have the CB1 controller; I just found out that my controller serial number starts with CB1-xxx. So can I perform any update?

Kind Regards
Veit

36. Tjomme

Hi,

I have the UR5 set up on a table and have taught all positions. Suppose I now want to use the robot in a different location with another process in a different setup, and when that is finished go back to the previous setup. I will never get the robot to work in *exactly* the same location.
Is there an easy way to get around the need to teach all positions again? For example, can I run a BeforeStart calibration program that teaches 3 points and makes the robot's world coordinates relative to those 3 points?
I imagine there is at least one way to do so: by making all positions relative to these three points, but that's quite a job.

Hi Tjomme

Your idea with the relative waypoints sounds like a good approach.

My experience is that you can use dowel pins and get the robot back into a fairly precise position. There are often only very few waypoints that need to be very accurate in a typical robot program, namely the pick and place positions; the intermediate waypoints in between often don't have to be within fractions of a millimetre. So after reinstalling the robot, these few waypoints can be checked and fine-tuned if necessary. That would amount to a similar amount of adjustment as if relative waypoints were used, and it is easier to handle in the programming.

Author:
By Zacobria Lars Skovsgaard
Authorised Universal-Robots Distributor.

1. Tjomme

Sadly, for this particular case I cannot dowel the robot in place. I can only (re)place it in about the same location manually. I've got many points where the exact position is required, and relocation of the robot may occur daily. Hence, teaching all points each time is not doable.

37. Jonathon

Hi again

I have a question of my own now.

I want to move from position p1 to p2, where p1 and p2 are in the format [x,y,z,rx,ry,rz]. However, as I calculate these values during program execution, I have no idea where they will be. All I know is that each will be a valid robot position, not out of reach. However, I have found that when moving from p1 to p2 the robot often hits itself.

Given that I don't know p1 and p2 in advance, is there a way of moving from one to the other that guarantees the robot won't hit itself? I did try resetting the robot to its 'home' position between moves; however, this takes too long after every movement. Waypoints would be difficult as I have no idea of the p1 and p2 values. Any suggestions would be much appreciated.

Cheers
Jonathon

Hi Jonathon

Thanks for contributing to our forum.

The robot will attempt the move, and if it hits something (including itself) or meets a singularity, it will safety stop. So if you don't know where the positions are, and if p2 is far away from p1, there is a risk the robot will hit something, especially if you use MoveL; maybe less so with MoveJ.

So the approach of going to a known place in between might be a good choice. You should be able to calculate, and thereby know, whereabouts the robot is: for example, looking at X, Y and Z will tell you which quadrant you are in, and if the robot is not in the target quadrant when you start moving, it might be an idea to create intermediate waypoints to go through first; similarly for Rx, Ry and Rz.

Author:
By Zacobria Lars Skovsgaard
Authorised Universal-Robots Distributor.

1. Jonathon

Thank you for getting back to me so quickly! I'll see if it's possible to calculate something based on these values. Sometimes the difference is so small, yet it still collides. I think with something like that it becomes difficult to automatically determine a safe point. I'll see what I can come up with though.

Jonathon

Hi Jonathon

When you say “difference is so small yet it still collides” – do you mean there is a very small difference between p1 and p2 ?

How much is “small” ?

If so – are any of the joints already near collision before the move ?

Author:
By Zacobria Lars Skovsgaard
Authorised Universal-Robots Distributor.

1. Jonathon

Yes that’s right and in some cases yes the joints are already very close to collision before moving to p2 from p1.

This is mainly due to the fact that the positions are being automatically calculated in my application. If the actual position it is trying to get to results in a collision, the UR5 knows this and says it is an invalid move; it doesn't, however, give me any warning about a collision during the movement from p1 to p2 until it has happened.

38. Andrew

Although many people here have asked the question, I feel like I still don’t have a good understanding of the exact definition of rx, ry, and rz. I have gone through the steps above and this all seems clear up to the point when multiple rotations are given in different axes.

Lars stated: “The Rx, Ry and Rz is the rotation measured in Radians of the tool axis i.e. the three axis at the tool head.” That's a start, but there are so many ways to measure rotations of axes (Euler (and then which convention, intrinsic or extrinsic), Tait-Bryan, axis-angle, quaternions, etc.), and it seems like UR doesn't use any of these formats directly in the rx, ry and rz.

Jonathan had a good explanation, but I’m unclear why the forward kinematics are needed in determining the rx, ry, and rz. That is, if I have a certain orientation and position, and I am supplying the controller with that pose, wouldn’t the inverse kinematics solver figure out the joint angles?

1. Jonathon

Hi Andrew,

I used forward kinematics as I required the matrix associated with these values, the forward kinematics value given by the UR forward kinematics command is simply a position vector, which was no use for the calculations I needed.

The inverse kinematics does indeed figure out the joint angles from some [x,y,z,rx,ry,rz] vector. Hope this helps?

Jonathon

39. Brandon

We have a requirement where we need to display the Teach Pendant values from the “move screen” in an external application.

We can do this via the real-time status interface, however, the real-time status interface appears to provide the current position of RX/RY/RZ in RPY [rad] values, instead of what’s displayed on the TP screen (Rotation Vector [rad]) values.

Is there a formula that can be used to calculate back and forth between the two values?
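A formula does exist: both representations describe the same rotation matrix, so you can build the matrix from RPY and then extract the axis-angle (rotation vector) from it. Below is a hedged Python sketch that assumes the common convention that the interface's RPY means R = Rz(yaw)·Ry(pitch)·Rx(roll); verify the convention against your controller before relying on it. The singular cases (rotation angle 0 or pi) are not handled.

```python
import math

def rpy_to_rotvec(roll, pitch, yaw):
    """RPY [rad] -> rotation vector [rad]. Sketch only; assumes
    R = Rz(yaw) * Ry(pitch) * Rx(roll) and a non-singular angle."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    # compose the rotation matrix R = Rz * Ry * Rx
    R = [[cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
         [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
         [-sp,     cp * sr,                cp * cr]]
    # axis-angle extraction: angle from the trace, axis from the skew part
    cos_theta = (R[0][0] + R[1][1] + R[2][2] - 1.0) / 2.0
    theta = math.acos(max(-1.0, min(1.0, cos_theta)))
    if theta < 1e-12:
        return (0.0, 0.0, 0.0)
    k = theta / (2.0 * math.sin(theta))
    return (k * (R[2][1] - R[1][2]),
            k * (R[0][2] - R[2][0]),
            k * (R[1][0] - R[0][1]))
```

Going the other way (rotation vector to RPY) is the same idea in reverse: build the matrix from the axis-angle (Rodrigues) and read the Euler angles off it.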

40. Ara

Hi
I am a beginner with ROS and the UR10. I want to know how we can move, for instance, 3 mm up, down, right or left of the current position. I saw in this forum some code about moving to a pose or joint position with a specific acceleration and velocity, but I do not completely understand how I can move the tip of the manipulator a specific amount in different directions.

41. Sakul

Can you give me an example of this? I don't understand how the GUI touch-panel program communicates with the computer…

Best Regards

Program
BeforeStart
‘Open a socket and wait for it to connect successfully’
var_1≔socket_open("192.168.0.101",12345)
Loop var_1≟ False
var_1≔socket_open("192.168.0.101",12345)
Wait: 0.5
‘Move the robot to waypoint1′
‘This acts as the origin for the vision coordinate system’
MoveJ
Waypoint_1
Robot Program
‘Wait for 3 floating point values e.g from camera to be received over the socket’
‘If successfully receive the values, scale them and copy into a pose’
If var_2[0]≠0
var_3≔p[var_2[1]/1000,var_2[2]/1000,0,0,0,d2r(var_2[3])]
‘Transform the pose into the coordinate system of Waypoint_1′
var_5≔get_actual_tcp_pose()
var_4≔pose_trans(var_5,var_3)
‘Move to the received position, wait for 1 seconds then move back to Waypoint_1′
MoveL
var_4
Wait: 1.0
Waypoint_1

42. Sakul

Hello Lars,

is there a possibility to move the robot via UR script to X, Y, Z coordinates?
I communicate via a socket connection with the robot, and I can also move the robot via this connection. The touch panel on the UR5 shows me the coordinates, but how can I tell the robot that it should move to this position via UR script?

Hi Sakul

Yes it is possible to give a pose in this format

movel(p[X, Y, Z, Rx, Ry, Rz], a, v, r)

The "p" means pose (not to be confused with the "p" in "movep").

It can also be a movej or movep pose:

movej(p[X, Y, Z, Rx, Ry, Rz], a, v, r)

movep(p[X, Y, Z, Rx, Ry, Rz], a, v, r)

You can see some data examples in this example

http://www.zacobria.com/universal-robots-zacobria-forum-hints-tips-how-to/movep-process-move-circular-move/

If you don't have the Rx, Ry and Rz data, then you can give them as you see them in the Move screen, in radians, like shown in the example.

a is the acceleration
v is the velocity
r is the blend radius
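Since Sakul already has a socket connection, the same pose format can be sent from an external PC as a raw script line; the controller accepts script on its secondary interface (port 30002). A minimal Python sketch (the IP address is hypothetical and the helper names are my own):

```python
import socket

def movel_command(pose, a=1.2, v=0.25):
    """Build a movel script line from a pose [x, y, z, rx, ry, rz]
    given in metres and radians, base frame."""
    return "movel(p[%.4f,%.4f,%.4f,%.4f,%.4f,%.4f], a=%.3f, v=%.3f)\n" % (
        tuple(pose) + (a, v))

def send_movel(robot_ip, pose, port=30002):
    """Send the command to the controller's secondary interface."""
    with socket.create_connection((robot_ip, port), timeout=2.0) as s:
        s.sendall(movel_command(pose).encode("ascii"))

# Example (hypothetical robot IP): move to X=0, Y=430 mm, Z=400 mm.
# send_movel("192.168.0.9", [0.0, 0.430, 0.400, 0.0, 3.1416, 0.0])
```

Note the units: the Move screen shows millimetres, but script poses are in metres, which is a common source of "the robot went somewhere strange" surprises.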

Author:
By Zacobria Lars Skovsgaard
Authorised Universal-Robots Distributor.

1. Sakul

Thanks Lars,

so the coordinates refer to the base of the robot, right?
Is there a possibility to set a new coordinate system?
I would like a solution where I can set the XYZ vector relative to a new position.

regards

Hi Sakul

Yes in the script command it is with the base as reference.

When programming in the GUI you can set a new reference point in the Features menu.

Author:
By Zacobria Lars Skovsgaard
Authorised Universal-Robots Distributor.

1. Sakul

Hello Lars,

so when I'm programming with UR script, I can't set a new coordinate system on the robot?

For example, I have an application with a camera which detects some objects. This application calculates coordinates for me; now I have to tell the robot where the objects are. Otherwise the robot can't pick them up.

I can't use the GUI programming interface, because it's not flexible enough. Thanks for your fast support.

Best regards

2. Sakul

I need to communicate more flexibly with the robot, so my PC application needs to receive the X, Y, Z coordinates from the robot. It's also important to send the robot the coordinates it should move to.

So the next point is that I need an interface between the camera and the robot, and that interface must be the computer.

I have a Matlab calculation function which is used to detect the objects via the camera. I then take this information and need to recalculate these coordinates into robot coordinates; these robot coordinates are sent via UR script to the robot.

I hope you can understand my problem a little better now.

Best regards

Hi Sakul

Yes and you can exchange data using the GUI.

I actually find the GUI much more flexible than writing script. In the GUI you can make your own coordinate system and only exchange data (send and receive) variables with the host running the Matlab application.

Regards
Lars

43. Paul

Hello,

is there a function to convert a joint position to a tool pose?

Input should be:
[Base, Shoulder, Elbow, Wrist 1, Wrist 2, Wrist 3]

and the output should be:
p[X, Y, Z, Rx, Ry, Rz]

Paul

Hi Paul

I am not aware of such function.

Author:
By Zacobria Lars Skovsgaard
Authorised Universal-Robots Distributor.

1. Paul

That's strange. Isn't that a transformation the robot has to do every time it displays the tool coordinates?

1. Jonathon

Hi Paul,

This is what forward kinematics does: if you calculate the transformation from the base to the end effector (the final joint), you can then calculate [x,y,z,rx,ry,rz] from it.

Forward kinematics is calculated using [Base, Shoulder, Elbow, Wrist 1, Wrist 2, Wrist 3]

Jonathon

44. Philip

I already looked at page 13, equations 2.9 and 2.10, which describe how to get the final coordinates.
But I couldn't understand how to get the rotation vector (three values) from one matrix. Could you explain in more detail?

Hello,

I have a question related to the movement of the robot when the tool has a fixed position. I understand that the last six buttons on the left side of the "Move" tab are used to increment/decrement Rx, Ry and Rz (this scenario was explained very well). I'm talking about these buttons:
Rotation_FixedTool
My questions is:
If we have the following scenario:
Random robot pose: X=59.37, Y=-712.27, Z=-9.50, Rx=-0.5230, Ry=2.990 Rz=-0.4808
And I send the following command:
movep(p[X=59.37, Y=-712.27, Z=-9.50, Rx=-0.7230, Ry=0, Rz=0], a, v, r) (we can see that Rx was changed and Ry, Rz were set to '0')
Which is the expected behaviour?
1) The robot should calculate its joint positions and move around the X axis without changing Ry, Rz.
2) The robot moves around the X axis and resets Ry and Rz to 0.

I actually don't understand how "movep" works. If I want to increment just Rx (to simulate the functionality of one button, for example), is it enough to send movep(p[fixed_x, fixed_y, fixed_z, new_Rx, 0, 0])?

Thank you very much for your support.

When you use movep with a "p" (meaning pose) in front of the 6 Cartesian coordinates, the robot moves to the given position (pose) in space. It is best to provide all 6 coordinates in the command; I don't know what happens if you leave some out, like Ry and Rz. Maybe they are interpreted as 0 then.

And when you change Rx, then Ry and Rz, and also X, Y and Z, are likely to change; that depends on where in space the pose is.

Instead you can try to use the speedl command, because there you can give 5 of the coordinates the value 0 and only move in one coordinate.

Try to see this comment on April 24 – 2014

http://www.zacobria.com/universal-robots-zacobria-forum-hints-tips-how-to/script-client-server-example/

So if you give a command like

speedl([0.0,0.0,0.0,0.02,0.0,0.0], 1.2, 1.0)

Then the robot will only turn around the X axis (Rx, rotate X). That also means the other coordinates will change, but the robot is only turning around the X axis.

Try to put the robot in a well-defined pose where the tool head is flat to the surface; then it is easier to understand.

Author:
By Zacobria Lars Skovsgaard
Authorised Universal-Robots Distributor.

Wow, Thank you for your quick response.
I observed that when we press the buttons (Rotation_FixedTool), the X, Y and Z remain unchanged. So, if I want to simulate this, is it enough to send this command:

speedl([unchanged_x,unchanged_y,unchanged_z,0.02,0.0,0.0], 1.2, 1.0) ?

Thank you very much,

46. cy_lee

Also, when the robot is in the home position, the Move pad shows Rx=0, Ry=2.21xxx, Rz=-2.24xxx, which is almost equal to 120 degrees. Why are they not 180 or 90?

1. Jonathon

This is a good example of creating the projection matrix [R|t]

http://www.itk.ntnu.no/ansatte/Gravdahl_Jan.Tommy/Diplomer/Kufieta.pdf#page74

Though when converting back to Euler angles you actually get RPY, which leads me to my question:

Do you know how the rotation vector (Rx,Ry,Rz) and the RPY vector (Rx,Ry,Rz) relate?

I have the RPY vector from Euler calculations, and I'm desperately trying to figure out how this relates to the rotation vector. Basically I want to use the move function and provide RPY instead of the rotation vector that is seen on the Move screen. They're both rotations around x, y, z but work differently and differ from each other. Any help would be much appreciated.

Hi Jonathon

Thanks for your contribution and very interesting study.

I do not know the relations and Euler equations, but maybe the script command d2r(d) can help (degrees to radians).

Author:
By Zacobria Lars Skovsgaard
Authorised Universal-Robots Distributor.

1. Jonathon

Thank you for your reply. Unfortunately not :( I’ll keep searching. I’ll post an update when I find out for anyone else who is interested

Thank you again

Jonathon

2. Jonathon

Unfortunately that doesn’t seem to do what I’m after.

Basically I've calculated a rotation matrix from the joint angles, and from this I'm getting the correct values for the first 3 elements of the TCP pose (the translation in mm); the second 3 only match the RPY. However, I need to somehow convert these RPY values to the standard rx, ry, rz ones to move it around. pose_trans, I'm guessing, just assumes the values are already in the standard rx, ry, rz form. Both versions are in radians, so unfortunately changing that doesn't help either.

Jonathon

Hi Jonathon

I don't know if it can help you to put the robot so all of Rx, Ry and Rz are 0; on this page you can see that pose.

http://www.zacobria.com/universal-robots-zacobria-forum-hints-tips-how-to/x-y-and-z-position/

(Scroll down a little)

The head is turned “upside down” and pointing upwards.

And then see what that is in RPY.

And then do some pose_trans moves on a single Rx or Ry or Rz, using your angles with d2r in pose_trans, and see if that can help to establish the relationship.

Regards
Lars

4. Jonathon

Thank you

Been trying this for some time. No joy

I've found out that the Rx, Ry, Rz rotation vector has a definition of sorts: the length of the vector is the angle to be rotated, and the vector itself gives the direction of the axis, though that's as far as I've managed to get so far.

2. Jonathon

Managed to solve it! :)

The vector is simply an axis-angle representation; the reason I thought it wasn't was due to an error in my calculations: I forgot to normalise the axis first. Here's a link to the wiki for anyone else trying to solve this: http://en.wikipedia.org/wiki/Axis%E2%80%93angle_representation

Basically you just need to calculate the angle, and then the axis (using the rotation matrix), to obtain the rotation vector.

Cheers
Jonathon
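For readers landing here, the recipe Jonathon describes can be sketched in a few lines of Python. Assumptions: a proper rotation matrix, and a rotation angle that is neither 0 nor pi (those singular cases need special handling):

```python
import math

def matrix_to_rotvec(R):
    """3x3 rotation matrix -> UR-style rotation vector (axis-angle).
    Sketch only; the theta = 0 and theta = pi cases are not handled."""
    # the rotation angle comes from the trace of the matrix
    theta = math.acos((R[0][0] + R[1][1] + R[2][2] - 1.0) / 2.0)
    # the unit rotation axis comes from the skew-symmetric part
    k = 1.0 / (2.0 * math.sin(theta))
    rx = k * (R[2][1] - R[1][2])
    ry = k * (R[0][2] - R[2][0])
    rz = k * (R[1][0] - R[0][1])
    # scale the unit axis by the angle to get the rotation vector
    return (theta * rx, theta * ry, theta * rz)

# Example: a 90-degree rotation about the base X axis
R_x90 = [[1, 0,  0],
         [0, 0, -1],
         [0, 1,  0]]
```

Running matrix_to_rotvec(R_x90) gives (pi/2, 0, 0), matching what the Move screen shows for that orientation.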

1. Helen

Hi, I have the same problem as you.
I know how to calculate RPY, but have no idea about the rotation vector.
With given D-H parameters and each joint angle, the forward kinematics matrix can be calculated.
With this matrix, the RPY values can also be calculated.
So my question is: how do you get the rotation vector from the matrix?

2. Philip

Hi Jonathon

I had the final coordinates to get the matrix, and looked at page 13, equations 2.9 and 2.10. But I couldn't understand how to get the rotation vector (three values) from one matrix. Could you explain in more detail?

3. Philip

Hi Jonathon,

I could get the final matrix. I have already looked at page 13, equations 2.9 and 2.10.
But I couldn't understand how to get the rotation vector (three values) from one matrix. Could you explain in more detail? This issue has confused me for a long time.

Philip

4. Jonathon

Hi Phillip,

I've only just seen this comment and I'm not sure if you're still watching this thread. Anyway, you need to calculate the forward kinematics for the robot, which is the transformation from the base to the end effector; information on that here: http://www.itk.ntnu.no/ansatte/Gravdahl_Jan.Tommy/Diplomer/Kufieta.pdf#page74

Once you have that you use that matrix to calculate the rotation vector which is explained here;

So in the example below, T is my forward kinematics matrix. The UR5 only returns a vector, so you need to calculate the matrix from the first link; then (in C#, using Math.NET's DenseVector) I do:

```
double theta = Math.Acos(((T[0, 0] + T[1, 1] + T[2, 2]) - 1) / 2);

Vector<double> rotationVector = new DenseVector(3);
double multi = 1 / (2 * Math.Sin(theta));
rotationVector[0] = multi * (T[2, 1] - T[1, 2]);
rotationVector[1] = multi * (T[0, 2] - T[2, 0]);
rotationVector[2] = multi * (T[1, 0] - T[0, 1]);

rotationVector = rotationVector / rotationVector.Norm(2);
var e = theta * rotationVector;
```

where e is my final rotation vector

5. Jonathon

Phillip

I forgot to say: to calculate the forward kinematics you use the joint positions, which are obtainable using get_actual_joint_positions().

Cheers
Jonathon

6. Michael

Thank YOU Jonathon, man, did you just save me from tearing my hair out. There is a brief mention in the UR script manual that they use an axis-angle representation, which I thought might be the equivalent angle-axis representation in the robotics literature, but I wasn't sure and didn't figure it out until you mentioned that the vector had to be normalised. I couldn't figure out why until I went back to my robotics text and saw that this is how the angle is compactly encoded. Then it all fell together.

7. Shahar

Hi Jonathon,
I hope you're still tracking these posts.
I want to solve the inverse problem and calculate the rotation matrix from rx, ry, rz.
Can you maybe help me?

thanks!

8. Jonathon

Hello Shahar

I am indeed still around, apologies for the late reply.

So basically what you have in (rx,ry,rz) is an axis-angle representation: the rotation angle and the axis of rotation are already known. If you check out this link:

http://nghiaho.com/?page_id=846

it shows you how to convert your axis angle to a rotation matrix.

Let me know if you have any questions, but it should be pretty straightforward.
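For the inverse direction that Shahar asks about, the Rodrigues formula turns (rx, ry, rz) back into a rotation matrix. A Python sketch (helper name is my own):

```python
import math

def rotvec_to_matrix(rx, ry, rz):
    """UR rotation vector (axis-angle) -> 3x3 rotation matrix (Rodrigues)."""
    theta = math.sqrt(rx * rx + ry * ry + rz * rz)  # angle = vector length
    if theta < 1e-12:
        return [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
    kx, ky, kz = rx / theta, ry / theta, rz / theta  # unit rotation axis
    c, s = math.cos(theta), math.sin(theta)
    v = 1.0 - c
    return [[kx * kx * v + c,      kx * ky * v - kz * s, kx * kz * v + ky * s],
            [ky * kx * v + kz * s, ky * ky * v + c,      ky * kz * v - kx * s],
            [kz * kx * v - ky * s, kz * ky * v + kx * s, kz * kz * v + c]]
```

As a check, rotvec_to_matrix(pi/2, 0, 0) reproduces the 90-degree rotation about X, [[1,0,0],[0,0,-1],[0,1,0]].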

9. Shahar

Hi Jonathon.
The link you sent describes how to build a rotation matrix from Euler angles, not an axis-angle.
Maybe I'm missing something…

47. cy_lee

Var_1=pose_trans(Base, Tool)
Var_2=pose_trans(Tool, Base)
Both of the returned results are p[digits], and these digits are the same.

48. cy_lee

Hi Lars,

When we use pose_trans(Base, Tool) and then pose_trans(Tool, Base), the returned results are the same. Theoretically, they should be different. Could you help to check this point? Thanks

Cy

49. cy_lee

Hi Lars,

I also tried T2 = Translate(x,y,z) * Rot(Rx) * Rot(Ry) * Rot(Rz), and it still cannot get the right position of the object. Thanks

Cy

50. cy_lee

Hi Lars,

Maybe my question was not clear. I mean the order of the rotations. For example, from object to TCP I got the homogeneous transformation matrix T1; now I want to get the transformation matrix from TCP to robot base, T2. For T2, I get the controller parameters x, y, z and Rx, Ry, Rz. Usually T2 = Translate(x,y,z) * Rot(Rz) * Rot(Ry) * Rot(Rx); however, after calculating like that, the result is not correct. So now I am wondering how to do such a transformation. Is my method correct?

Thanks
Cy

Hi Cy

I do not have the formula. I would think the equation should perhaps also contain the physical construction of the robot, such as arm lengths and joint offsets etc., but I have not used that.

Regards
Lars

51. cy_lee

Hi Zacobria,

Do you know how to use Rx, Ry, Rz to construct the rotation matrix? Right now I need to transform the local coordinates to the robot tool, which I have done. Then transform from the robot tool to the robot base. However, I do not know how to use the Rx, Ry, Rz, since we do not know the inner calculation rules of the UR5. Thanks

Cy

Hi Cy

I do not have these calculation equations. There is some general information in the links shown below.

Regards
Lars

52. Jacco van der Spek

Can rx, ry, rz be seen as the roll, pitch and yaw in a reference frame that is defined at the tool tip?
Or am I missing a step here?
On second thought: or is it a reference frame that is defined in the same way as the base, but translated to the x, y, z of the tool tip?

Hi Jacco

Definitely “No” to the second thought – and “Maybe” to the first thought.

It is like this.

Rx – is rotation around the X axis.
Ry – is rotation around the Y axis.
Rz – is rotation around the Z axis.

The X, Y, Z is from the base and you can think about it like a coordinate system from the base.

For example Y axis is along the axis where the grey cable is coming out (on UR5), and X axis is perpendicular on that – and Z axis is going straight up and down.

Put the robot in Base view (on the Move screen).

Put the robot so the tool head is above the Y axis pointing down, i.e. 0 on the X axis and maybe 400 mm on the Z axis.

Now change Rx just a little bit, and you can observe that the robot head turns around the X axis (like nodding).

Now change Ry just a little bit, and you can observe that the robot head turns around the Y axis (like turning the head to the side).

Now change Rz just a little bit, and you can observe that only the tool flange turns, because it is turning around the axis going straight up and down.

(Note: all three tests above were performed with the robot head starting back in the position with the head pointing downwards.)

Regards
Zacobria

1. Jacco van der Spek

In the meantime I have been able to figure out that rx, ry, rz are a rotation vector, or axis-angle representation (I didn't know that notation). By now I get the concept. And the reference frame is just the same as for the base.
Tomorrow I will be able to play around a bit with the robot, so I think I will be able to figure out what is what.
Using this: http://www.euclideanspace.com/maths/geometry/rotations/conversions/angleToEuler/
I will be able to convert the values to the yaw, pitch and roll of the tool tip, I suppose.

53. Amir masoud

Hi Lars,
I need to send a pose over TCP/IP from Python to PolyScope.
I checked the rotations, i.e. Rx, Ry and Rz. If I set each angle individually, e.g. Rx, the robot moves as expected. For example, I set (Rx=pi/2, Ry=0, Rz=0) and the robot's end effector rotates pi/2 around the X axis.
But when I set the next rotation, e.g. Rx=pi/2, Ry=pi/2, Rz=0, I cannot understand which axis the robot rotates around. I checked; it does not rotate around the Y axis of the reference frame at the base of the robot.
My question:
Can you please help me to understand how to compute (Rx,Ry,Rz) given an orientation of the tool tip.
For example, I need to have robot at
X = 0
Y = 80
Z = 550
Tool tip X axis in robot base frame = [1 0 0]
Tool tip Y axis in robot base frame = [0 0 1]
Tool tip Z axis in robot base frame = [0 -1 0]
How can I compute (Rx,Ry,Rz)?
Amir

1. Amir masoud

Sorry, I forgot to add the example completely:
X = 0
Y = 80
Z = 550
Tool tip X axis in robot base frame = [1 0 0]
Tool tip Y axis in robot base frame = [0 0 1]
Tool tip Z axis in robot base frame = [0 -1 0]
This orientation is obtained by Rx=pi/2
but how can I get the following one,

Tool tip X axis in robot base frame = [0 1 0] —> x’ = y
Tool tip Y axis in robot base frame = [0 0 1] —> y’ = z
Tool tip Z axis in robot base frame = [1 0 0] —> z’ = x

x, y, z are robot reference frame axis.
Rx=?
Ry=?
Rz=?

Thanks,
Amir

Hi Amir

The calculation for a 6 axis robot is a complicated equation, especially for the rotation vectors, and I don’t have any examples. There is a link where you can learn some more.

http://en.wikipedia.org/wiki/Axis_angle

This is why it is good to use the robot in teach mode, because then the robot will do all these calculations for you.
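For reference, the computation Amir asks about – turning a desired tool orientation (a rotation matrix whose columns are the tool axes in the base frame) into a rotation vector – can be sketched in Python. This is a minimal axis-angle extraction with a hypothetical function name, covering only the general case (it is not valid at exactly 0 or 180 degrees of rotation):

```python
import math

def matrix_to_rotvec(R):
    """Convert a 3x3 rotation matrix (list of rows) to a UR rotation vector.

    Sketch of the standard axis-angle extraction; the columns of R are the
    tool X, Y and Z axes expressed in the base frame. General case only
    (rotation angle strictly between 0 and pi).
    """
    trace = R[0][0] + R[1][1] + R[2][2]
    theta = math.acos(max(-1.0, min(1.0, (trace - 1.0) / 2.0)))
    k = 1.0 / (2.0 * math.sin(theta))          # normalising factor for the axis
    return (theta * k * (R[2][1] - R[1][2]),
            theta * k * (R[0][2] - R[2][0]),
            theta * k * (R[1][0] - R[0][1]))

# Amir's second example: tool x' = base y, y' = base z, z' = base x.
# The columns of R are the tool axes in base coordinates.
R = [[0, 0, 1],
     [1, 0, 0],
     [0, 1, 0]]
rx, ry, rz = matrix_to_rotvec(R)   # all three come out equal, about 1.209 rad
```

For that orientation the three components are equal (the rotation axis is the base-frame diagonal (1,1,1) and the angle is 120 degrees), which is hard to guess by intuition – exactly why teach mode is the easier route.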

Regards
Lars

1. Victor

Hi, thanks for your reply. I’ve made a new plane and I’ve set the new points. I can move the robot to the point I want if I manually put the x, y, z, rx, ry, rz on the “Move” screen. But when trying to move the robot to one of the points using movel(p[x,y,z,rx,ry,rz]), the robot moves to a completely different point.

Hi Victor

Yes correct.

The feature is for the controls in Move screen so the robot moves in relation to your Feature setting.

http://www.zacobria.com/universal-robots-zacobria-forum-hints-tips-how-to/features-screen/

If your data is sent from an external source, the origin is the robot base. It is possible to calculate a new origin, e.g. an external camera position, and add or subtract that from your pose coordinates.

For example, if you mount the camera above a typical area for the robot and keep the camera at ”0” on the X axis, then the X coordinate will be the same for the camera and the robot. If the camera is mounted 500 mm up on the Y axis, then you add 0.500 (500 mm) to the Y coordinate from the camera.
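A minimal sketch of this offset idea, assuming a fixed camera mounted 500 mm up the Y axis (the names and numbers are illustrative only):

```python
# Illustrative only: shift a position reported by a fixed camera into
# robot base coordinates by adding the camera's mounting offset.
CAMERA_OFFSET = (0.0, 0.500, 0.0)   # camera mounted 500 mm up the Y axis

def camera_to_base(pose):
    """pose = (x, y, z) in metres, as seen from the camera origin."""
    return tuple(p + o for p, o in zip(pose, CAMERA_OFFSET))
```

So a point the camera reports at y = 0.200 m ends up at y = 0.700 m in robot base coordinates, exactly the add-0.500 arithmetic described above.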

Regards
Zacobria.

54. Victor

I’m getting the rotation of a detail from a vision system, e.g. 209.05. The rotation is presented in degrees. How do I convert it to the rotation of the robot tool? From what I understand I need Rx, Ry, Rz?

Hi Victor

It depends on how many dimensions your feedback from the camera has.

If it is only one dimension, I would use only the Rz, by converting your angle in degrees into radians with simple math.
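The conversion is a one-liner in most languages; a Python sketch using the vision angle from the question (the variable names are ours):

```python
import math

# Sketch: the vision system reports the detail's rotation in degrees
# (e.g. 209.05, as in the question); URScript poses expect radians.
angle_deg = 209.05
rz = math.radians(angle_deg)   # value to use for Rz in the pose
```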

Regards
Zacobria
Universal Robots distributor Singapore.

1. Victor

Thank you for your help, but I have one more question. When moving the robot to the pose, I use movej but get the following error: “movej(): SingularityException: getInverse(): no inverse solution”.
What does this error message mean? I don’t get the error while using movel. What’s the difference between movej and movel?

Hi Victor

“J” and “L” have the same meaning as when programming in the UR GUI.

movej means a “joint” move, i.e. the robot will calculate a joint rotation move – the easiest path for the robot when turning its joints – so the path does not follow a certain pattern.

movel means a “linear” move, i.e. the robot will calculate a linear path to the next waypoint – the joints will move so that the tool head follows a straight line to the next waypoint.

My experience is that I can get the error message you mention when using movel – the error occurs if it is not possible to take a linear path to the next waypoint, typically because the robot would run into itself.
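For readers driving the robot from an external host (as several posts in this thread do), the two commands can be compared side by side. This is a hedged sketch – the helper function is our own, not a UR API, and in practice the resulting string would be sent to the robot’s script port:

```python
def move_cmd(kind, pose, a, v):
    """Format a URScript move command as a string.

    kind is "movej" (joint-space move, robot picks its own route) or
    "movel" (tool follows a straight line). pose is (x, y, z, rx, ry, rz)
    in metres and radians. Helper name and formatting are our own sketch.
    """
    vals = ", ".join("%.4f" % p for p in pose)
    return "%s(p[%s], a=%.3f, v=%.3f)" % (kind, vals, a, v)

# Same target pose, two different path behaviours:
linear = move_cmd("movel", (0.0, 0.430, 0.400, 0.0, 3.14, 0.0), 1.2, 0.25)
joint = move_cmd("movej", (0.0, 0.430, 0.400, 0.0, 3.14, 0.0), 1.4, 1.05)
```

Both strings target the same pose; only the path the robot takes between waypoints differs, which is why movel can raise a singularity error on a pose that movej reaches without complaint.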

http://www.zacobria.com/universal-robots-zacobria-forum-hints-tips-how-to/singularity/

Regards
Zacobria
Singapore.

1. Victor

Thank you, apparently it works if I add a number to the Ry, not sure why.
Is it possible to create a new “coordinate system”? I have a vision system that I need to work with the UR, and I need the UR to work with the vision system’s coordinates.

Hi Victor

A way to do that is to set the Feature on the Move screen to “Base” and then offset your coordinate data from the robot base position to your camera position.

Regards
Zacobria

55. Peter L.

Hello Lars,
is it possible to define a new coordinate system with another origin for the UR10? How can we make that possible?
Thank you!

56. Peter L.

Hello,

I want to give the robot a variable position like p[modbus_1, modbus_2, modbus_3, modbus_4, modbus_5, modbus_6].

I want to receive the position values via Modbus/TCP from another controller. I cannot put variables in the pose p. Is there any way to give the robot its position via Modbus/TCP, e.g. from a PLC?

Hi Peter

This example shows how to get external variables for a position from a PC, and you should be able to do the same from a Modbus node.

http://www.zacobria.com/universal-robots-zacobria-forum-hints-tips-how-to/script-client-server/

Scroll down to where it says

“UR program that ask for the position coordinates from PC host.”

Instead of a PC program: if you can read the Modbus registers, then you can use the same method to insert the variables into the pose.
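As a sketch of that method, assuming six Modbus registers holding the position in tenths of a millimetre and the rotation in milliradians (the scaling is purely an assumption – use whatever units your PLC actually sends):

```python
def registers_to_pose(regs):
    """Turn six raw Modbus register values into a UR pose list.

    Hypothetical scaling: regs[0:3] hold x, y, z in tenths of a
    millimetre, regs[3:6] hold rx, ry, rz in milliradians. The result
    is [x, y, z, rx, ry, rz] in metres and radians, ready to insert
    into a pose such as movej(p[...]).
    """
    x, y, z = (r / 10000.0 for r in regs[:3])
    rx, ry, rz = (r / 1000.0 for r in regs[3:])
    return [x, y, z, rx, ry, rz]
```

For example, registers [4300, 0, 4000, 0, 3142, 0] would become the pose [0.43, 0.0, 0.4, 0.0, 3.142, 0.0].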

Regards
Lars

57. Mustafa

Hello Lars,

Thank you for the quick and easy-to-follow reply. Now the Feature option makes more sense :).
I guess I should first explain the task we want the UR5 to carry out, to make my question clearer. In our project there will be a rectangular object with known dimensions, positioned randomly in the workspace. Using a vision system we will find the center point of the object, or one of its corners, and this point’s coordinates will be passed to the UR5 over Ethernet. Then our TCP should move to certain points over the area of the object, using the center point (or corner) received from the vision system as a reference point.

Of course we can measure the angle of rotation of the object relative to the Base axes and recalculate the points’ positions (e.g. instead of moving 20 mm on the object’s x-axis, move 7 mm on the base x-axis), but that just becomes more complicated, especially when we have more than 100 points the TCP should reach.

Therefore, we were asking to see if it is possible to set a reference point or reference coordinates instead of the base coordinates when using movej(p[x, y, z, rx, ry, rz]). Obviously we would have to do that using script code or GUI programming inside the UR5, not manually with the Move window – is that possible?

Best Regards,
Mustafa

Hi Mustafa

You might consider changing the TCP offset, which can be done in the ”Installation” – ”TCP Position” screen, for example to change the X, Y offset – but that will not change the angle.

If the origin and orientation of the objects in the field of view change for each object placed there – which they normally do in vision applications – then I think you still need to calculate the angles in order to set new reference coordinates, because the robot will need to know each object’s location and orientation, which comes from the vision system anyway.

If the orientation of the objects is always fixed, but at an angle relative to the robot, you might consider mechanically changing the orientation of the vision camera when you capture the image. If the camera is mounted on the robot head, record how much the camera angle is changed – then the robot can help you get the angles and make the calculation easier.

I would go for using the robot base axes as the reference and then calculate the offsets and create a formula. This is what I do in the vision applications shown on our channel.
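The “formula” approach can be sketched in Python: rotate the offset given in the object’s own axes by the object’s angle from the vision system, then add the object’s centre (the names and values are illustrative):

```python
import math

def object_to_base(cx, cy, angle, dx, dy):
    """Map a point given in the object's own axes into base coordinates.

    (cx, cy) is the object's centre in base coordinates and angle its
    rotation (radians), both from the vision system; (dx, dy) is the
    offset along the object's own x and y axes. Standard 2D rotation.
    """
    bx = cx + dx * math.cos(angle) - dy * math.sin(angle)
    by = cy + dx * math.sin(angle) + dy * math.cos(angle)
    return bx, by
```

With an object centred at (0.2, 0.3) and rotated 90 degrees, a 20 mm move along the object’s x-axis becomes a 20 mm move along the base y-axis – the same recalculation Mustafa describes, done once in a formula instead of per point.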

Regards
Lars

58. Mustafa

Dear Lars,

Thank you for this easy guide on how to use the UR5 robot. I have a couple of questions regarding robot positions and movements. Firstly, why can’t we see the tool position coordinates if we set the Feature to “Tool”? And in case I have an object with a tilted alignment (i.e. 45° on the x-axis), how can I tell the robot to move 20 mm along the x-axis of the object (not the base x-axis)? Is it possible to set a reference point or reference axes?
One last question regarding movej(p[x, y, z, rx, ry, rz]): in this command we are also moving the TCP relative to the Base – is it possible to move the TCP relative to the tool pose, as we do when we select Feature = Tool in the Move window?

Thank you for the help
Best Regards,
Mustafa

Dear Mustafa

Thank you very much for your question.

1.
When the Feature is set to ”Tool”, the origin is the TCP point and therefore the coordinates are 0,0,0,0,0,0 seen from the TCP point.

The use of Feature is to select the reference from which the robot moves, for example when using the arrow keys on the left side of the Move screen – which is also reflected in the graphics.
Notice that the graphics show the Tool as the origin when ”Tool” is selected as the Feature.

When Base is selected, the arrow keys move the robot as seen from the base in the X, Y, Z directions, e.g. the Y direction is the direction where the cable comes out of the base.

2.
Yes you can.

In the ”Installation” screen go to ”Features” and select ”Line”. Set two points that are, for example, the end points of your 45 degree offset X axis.

Go back to Move screen – Now in Features – select ”Line_1” instead of ”Tool”.

Now the robot moves along that line when using the Arrow keys (just below Up/Down) on the left side of the screen.

There is more about the “Features” here.

http://www.zacobria.com/universal-robots-zacobria-forum-hints-tips-how-to/features-screen/

3.
I am not sure I understand the question, because when ”Tool” is selected the TCP is the origin, i.e. 0,0,0,0,0,0 – and after you move the robot the TCP is still 0,0,0,0,0,0 – so a move command relative to the TCP will not work when ”Tool” is selected, because the reference point itself is moving.

Maybe try to use the “Line” feature as the reference.

Regards
Lars

1. nalu

Dear Lars,
Thank you for this easy guide on how to use the UR5 robot. Can you tell me how the UR5 robot’s Rx, Ry, Rz position is described? I want to give the robot a specified angle position and have it assume the posture I want. I have tried some description methods, e.g. X-Y-Z Euler angles and Z-Y-X Euler angles, but cannot get the right description.
Best Regards,
nalu

59. Pablo Jesus Perez

Hi! I am a student of robot systems and I’m currently programming a UR5 via ROS. I am now trying to move it with “movej” and a pose ( “movej(p[...])” ), but the robot cannot figure out the solution for the inverse kinematics. I have moved the robot by hand and seen that it is possible for it to reach that point (in fact I have the inverse kinematics result, since I could read it on the UR screen). I have also tried passing acceleration, velocity, time and radius, but still nothing – the same error appears on the screen: “movej(): SingularityException: getInverse(): no inverse solution”.

Just in case, I have also tried other movements (which I have also checked by moving the robot myself beforehand) and the same error keeps turning up. Could you help me with this problem, please?

Thank you,
Pablo.

Dear Pablo

Try to perform the command when the robot is close to the target position and let me know if that’s possible.

Regards
Lars

1. Eric

Hi Lars,

I am having the same problem as Pablo, even when the robot is close to the target position. Furthermore, if I don’t provide acceleration/velocity/radius, I get the error “Expected lvalue Pose, but received List”.

Thank you,
Eric

1. Eric

Just received the error again, and I had misremembered the exact message. It was actually, “TypeError 2: expected lvalue type Pose but found List”

Hi Eric

If you can show the code it will be easier to guide you.

It looks like the robot does not get the data it expects for the command used, e.g. if the number of elements in an array is incorrect.

A move command based on a pose will need a ”p” like this example.

movej(p[0.6, 0.6, 0.2, 0.62, 5.25, 0.17])

A move command based on joints is without the ”p” like this example.

movej([-4.436072065494562, -1.3509294688607585, -1.5898662740424647, -1.7720278195761425, 1.5692639625048612, -2.8652732472940476], a=1.3962634015954636, v=1.0471975511965976)

If the number of elements is wrong I guess you can get the error you see.

Even if you are near the target it could still be a singularity along the way e.g. if a joint has to move more than it can perform within the time allowed.

Also check that none of the joints are at their outermost position. Each joint can turn +/- 360 degrees (if not crashing into itself), i.e. 720 degrees in total, but it cannot go beyond that if a joint has already reached its end.

If you are using blends make sure the blends are not overlapping.

If you are using circular move – make sure the radius is not 0.

Regards
Lars

2. Eric

Discovered the solution to my problem. I was using a variable ‘q’ for both ToolPose and JointPosition, but a variable can only be one or the other. Its type will not dynamically change with assignment.

Hi Eric

Thanks for your messages and good that you found the solution.

Feel free to show your method on the forum.

Regards
Lars

4. Eric

Also, thank you for your quick response, Lars!

I have another question. With the movej command (using JointPositions) in urscript, I expected that the robot would always go to the specified JointPositions. However, it appears that the robot will instead perform forward kinematics to determine the ToolPose, and will then perform inverse kinematics to find the optimal path to that ToolPose. This behavior can cause the robot to physically collide with other objects in its vicinity, which is why I wanted to specify joint angles in the first place.

I think that this behavior also causes problems when I specify a TCP-offset. I have found that the robot’s movej path will be different, depending on the TCP-offset. What I want instead is for the robot to move to the JointPositions as I have them specified, and only use the TCP-offset to correct the telemetry reported via status messages. I do not want the TCP-offset to affect the physical path of the TCP itself.

Thanks again,
Eric

Hi Eric

Yes, the robot is waypoint oriented – especially with MoveJ, where the robot calculates its own path between waypoints – and the longer the distance between waypoints, the more the resulting paths can differ.

Either add more waypoints along the way – or consider using MoveL or MoveP, where the path is better controlled.

Other commands to try are the SpeedJ, SpeedL, StopJ and StopL – they are very useful.

Regards
Lars