# -*- coding: utf-8 -*-
"""23055_AnishkaGupta_MinorProject.ipynb
Automatically generated by Colaboratory.
Original file is located at
https://colab.research.google.com/drive/1UbPGIVuM0m2aTy-q-feVFaeiop3GIYFd
# **MINOR PROJECT**
---
# **ROBOTIC KINEMATICS DATASET**
# **TASK 1** - Exploratory Data Analysis
<-----------------------Question 1----------------------------->
### **How are the joint values (q1, q2, q3, q4, q5, q6) generated for a robot? Can you explain how these values are determined?**
"""
# Colab shell commands (kept commented out so this exported script stays valid Python;
# run them in a notebook cell instead):
# !pip install roboticstoolbox-python
# !wget https://www.kaggle.com/datasets/sandibaressiegota/robot-kinematics-dataset
# Note: the wget URL is the Kaggle dataset page; the archive read below as 'kaggle.zip'
# has to be downloaded separately (e.g. via the Kaggle website or the Kaggle API).
# Commented out IPython magic to ensure Python compatibility.
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
# %matplotlib inline
sns.set(color_codes=True)
import warnings
warnings.filterwarnings('ignore')
from sklearn import tree
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor
from sklearn.preprocessing import MinMaxScaler
data = pd.read_csv('kaggle.zip')  # pandas can read a zip archive containing a single CSV directly
type(data)
data.shape
data.dtypes
columns=['q1','q2','q3','q4','q5','q6','x','y','z']
data=data.loc[:,columns]
data.head(10)
q1 = np.random.uniform(-2.3562, 2.3562)
q2 = np.random.uniform(-1.0472, 2.0944)
q3 = np.random.uniform(-3.1416, -0.7854)
q4 = np.random.uniform(-2.9671, 2.9671)
q5 = np.random.uniform(-1.5708, 1.5708)
q6 = np.random.uniform(-3.1416, 3.1416)
print("Joint Values (q1, q2, q3, q4, q5, q6):", q1, q2, q3, q4, q5, q6)
"""This line generates a random value for q1 using the np.random.uniform function. The function takes two arguments: the lower bound (-2.3562 in this case) and the upper bound (2.3562). The function returns a random floating-point number within the specified range, and that value is assigned to q1 and similarly for q2,q3,q4,q5,q6.
<-----------------------Question 2----------------------------->
### ***How are the x, y, and z coordinates calculated based on the joint values of a robot? Can you explain the relationship between the joint values and the Cartesian coordinates?***
"""
def forward_kinematics(joint_angles):
    # Define your robot's DH parameters here
    dh_parameters = [
        [0.1, 0, 0.2, joint_angles[0]],
        [0.3, np.pi/2, 0, joint_angles[1]],
        [0.15, 0, 0, joint_angles[2]],
        [0, np.pi/2, 0.25, joint_angles[3]],
        [0, -np.pi/2, 0, joint_angles[4]],
        [0, np.pi/2, 0, joint_angles[5]]
    ]
    # Initialize the transformation matrix
    T = np.eye(4)
    for i in range(len(joint_angles)):
        a, alpha, d, theta = dh_parameters[i]
        # Calculate the transformation matrix for the current joint
        A_i = np.array([
            [np.cos(theta), -np.sin(theta)*np.cos(alpha), np.sin(theta)*np.sin(alpha), a*np.cos(theta)],
            [np.sin(theta), np.cos(theta)*np.cos(alpha), -np.cos(theta)*np.sin(alpha), a*np.sin(theta)],
            [0, np.sin(alpha), np.cos(alpha), d],
            [0, 0, 0, 1]
        ])
        # Multiply the transformation matrix with the overall transformation matrix
        T = np.dot(T, A_i)
    # Extract the position (x, y, z) from the transformation matrix
    position = T[:3, 3]
    return position
# Example usage with the provided joint values
joint_angles = [-1.51, -0.763, 1.85, -0.817, 0.912, 2.32]
end_effector_position = forward_kinematics(joint_angles)
print("End-effector position:", end_effector_position)
"""The code allows you to compute the position of the robot's end-effector based on the given joint angles, which is useful for planning and controlling the robot's motion in Cartesian space.The code is functioning by using the Denavit-Hartenberg (DH) convention to describe the kinematic relationships between different joints in the robot manipulator. It follows a loop that iterates over each joint angle, calculates the transformation matrix for that joint, and accumulates the transformations to determine the final position of the end-effector.
<-----------------------Question 3----------------------------->
### **Which joint variables (q1..q6) have a significant impact on the x, y, and z coordinates? Can you analyze their influence?**
"""
def calculate_end_effector_position(joint_values):
    x = np.sin(joint_values[0]) + np.cos(joint_values[1])
    y = np.sin(joint_values[2]) + np.cos(joint_values[3])
    z = np.sin(joint_values[4]) + np.cos(joint_values[5])
    return x, y, z

def determine_joint_impact():
    joint_values = [0.0, 0.0, 0.0, 0.0, 0.0, 0.0]  # Initial joint values
    # Calculate the end-effector position with the initial joint values
    x0, y0, z0 = calculate_end_effector_position(joint_values)
    joint_impacts = []
    for i in range(len(joint_values)):
        original_value = joint_values[i]
        # Incrementally vary the joint variable by a small step
        step = 0.01  # Adjust the step size as needed
        joint_values[i] += step
        # Calculate the end-effector position with the updated joint value
        x, y, z = calculate_end_effector_position(joint_values)
        # Calculate the change in position
        dx = x - x0
        dy = y - y0
        dz = z - z0
        # Store the impact of the joint variable on the Cartesian coordinates
        joint_impact = [dx, dy, dz]
        joint_impacts.append(joint_impact)
        # Restore the original joint value for the next iteration
        joint_values[i] = original_value
    return joint_impacts
# Example usage
joint_impacts = determine_joint_impact()
# Print the impacts of each joint variable
for i, joint_impact in enumerate(joint_impacts):
    print("Joint q{} impact: {}".format(i+1, joint_impact))
"""The code calculates the impacts of each joint variable on the Cartesian coordinates of the end-effector by incrementally varying each joint value and observing the resulting changes in position. This information can be useful for analyzing the sensitivity of the end-effector position to different joint variables in a robotic system.
<-----------------------Question 4----------------------------->
### **Can you show the relationship between joint values and the corresponding x, y, and z coordinates using scatter plots or visualizations?**
"""
def calculate_end_effector_position(joint_values):
    x = np.sin(joint_values[0]) + np.cos(joint_values[1])
    y = np.sin(joint_values[2]) + np.cos(joint_values[3])
    z = np.sin(joint_values[4]) + np.cos(joint_values[5])
    return x, y, z

def plot_joint_coordinates(joint_values):
    # Calculate the end-effector position for each set of joint values
    x_values, y_values, z_values = [], [], []
    for joint_set in joint_values:
        x, y, z = calculate_end_effector_position(joint_set)
        x_values.append(x)
        y_values.append(y)
        z_values.append(z)
    # Create a scatter plot
    fig = plt.figure()
    ax = fig.add_subplot(111, projection='3d')
    ax.scatter(x_values, y_values, z_values, c='b', marker='o')
    # Set labels and title
    ax.set_xlabel('X')
    ax.set_ylabel('Y')
    ax.set_zlabel('Z')
    ax.set_title('Joint Coordinates')
    # Show the plot
    plt.show()
# Example usage
joint_values = np.random.uniform(low=-np.pi, high=np.pi, size=(100, 6)) # Generate random joint values
plot_joint_coordinates(joint_values)
"""The code calculates the Cartesian coordinates of the end-effector for given joint values using trigonometric functions and then creates a scatter plot to visualize the joint coordinates in 3D space. This can be useful for understanding the relationship between joint configurations and the resulting end-effector positions in a robotic system.The sine and cosine functions are fundamental trigonometric functions that relate angles to the ratios of sides in a right triangle. In the context of the provided code, these functions are used to calculate the Cartesian coordinates (x, y, z) of the end-effector based on the given joint values.
By applying these trigonometric functions to the joint values, the code incorporates the joint angles into the calculations and derives the corresponding Cartesian coordinates of the end-effector. This allows for the visualization and analysis of the end-effector's position in relation to the joint configurations.
<-----------------------Question 5----------------------------->
### **Are there any unusual or extreme values in the joint variables or the corresponding x, y, and z coordinates? Do they indicate any interesting patterns or anomalies?**
"""
# Define the joint values and corresponding x, y, z coordinates
joint_values = np.array([-1.51, -2.84, -1.23, -1.99, 1.05, 0.762, -0.0943, -1.38, 2.75, -1.42, -1.45, -1.35, 2.75, -0.627, -1.61, -0.781, 2.46])
x_coordinates = np.array([-0.092, 0.142, -0.0833, 0.135, -0.056, -0.168, 0.00422, -0.0954, -0.00242, 0.0448, -0.0137, -0.102, 0.00315, -0.107, -0.127, -0.0267, -0.0688])
y_coordinates = np.array([0.15, -0.1, 0.223, -0.0314, -0.229, -0.0712, -0.0616, 0.235, -0.15, -0.169, 0.192, 0.098, -0.142, 0.143, 0.153, -0.0989, -0.131])
z_coordinates = np.array([0.301, 0.225, 0.206, 0.37, 0.26, 0.245, 0.12, 0.355, 0.209, 0.049, 0.238, 0.164, 0.382, 0.427, 0.286, 0.34, 0.282])
# Plotting joint values
plt.figure(figsize=(10, 4))
plt.subplot(1, 4, 1)
plt.scatter(range(1, len(joint_values) + 1), joint_values)
plt.xlabel('Data Point')
plt.ylabel('Joint Value')
plt.title('Joint Values')
# Plotting x, y, z coordinates
plt.subplot(1, 4, 2)
plt.scatter(x_coordinates, y_coordinates)
plt.xlabel('X Coordinate')
plt.ylabel('Y Coordinate')
plt.title('X-Y Coordinates')
plt.subplot(1, 4, 3)
plt.scatter(x_coordinates, z_coordinates)
plt.xlabel('X Coordinate')
plt.ylabel('Z Coordinate')
plt.title('X-Z Coordinates')
plt.subplot(1, 4, 4)
plt.scatter(y_coordinates, z_coordinates)
plt.xlabel('Y Coordinate')
plt.ylabel('Z Coordinate')
plt.title('Y-Z Coordinates')
plt.tight_layout()
plt.show()
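# A quantitative follow-up (a sketch, not in the original notebook): flag values lying
# more than 1.5*IQR outside the quartiles for each of the arrays defined above.
def iqr_outliers(values, name):
    q_low, q_high = np.percentile(values, [25, 75])
    iqr = q_high - q_low
    lower, upper = q_low - 1.5 * iqr, q_high + 1.5 * iqr
    outliers = values[(values < lower) | (values > upper)]
    print(f"{name}: {len(outliers)} outlier(s)", outliers)

for name, values in [("joint values", joint_values), ("x", x_coordinates),
                     ("y", y_coordinates), ("z", z_coordinates)]:
    iqr_outliers(values, name)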
"""The provided scatter plots of joint values and their corresponding Cartesian coordinates do not show any unusual or extreme values. There are no clear patterns or anomalies observed in the distribution of the data points. The values and coordinates appear to be within a certain range without any significant clustering or correlation.
# **TASK 2** - Classification/Regression
Perform the following steps on the same dataset that was used for EDA.
> - Data Preprocessing (as per requirement)
> - Feature Engineering
> - Split dataset in train-test (80:20 ratio)
> - Model selection
> - Model training
> - Model evaluation
> - Fine-tune the Model
> - Make predictions
"""
columns=['q1','q2','q3','q4','q5','q6','x','y','z']
data=data.loc[:,columns]
data.head(10)
type(data)
data.shape
data.dtypes
data.info()
data.describe()
data.isnull().sum()
fig, axs = plt.subplots(9,1,dpi=95, figsize=(7,17))
i = 0
for col in data.columns:
    axs[i].boxplot(data[col], vert=False)
    axs[i].set_ylabel(col)
    i += 1
plt.show()
import pandas as pd
# Re-load the dataset (the zip archive downloaded from Kaggle contains the CSV with the relevant columns)
data = pd.read_csv('kaggle.zip')
# Verify the loaded data and column names
print(data.head())
# Step 1: Data preprocessing (handle missing values, if any)
data = data.dropna()
# Step 2: Feature engineering (the six joint angles are used directly as input features;
# an optional scaling sketch follows the train-test split below)
# Step 3: Split dataset into train-test
from sklearn.model_selection import train_test_split
# Split the dataset into input features (X) and target variable (y)
X = data[['q1', 'q2', 'q3', 'q4', 'q5', 'q6']]
y = data[['x', 'y', 'z']]
# Split into train and test sets (80:20 ratio)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
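# Optional preprocessing sketch (not part of the original notebook): scale the joint
# angles to [0, 1] with the MinMaxScaler imported earlier, fitting on the training set
# only to avoid leakage. The scaled copies are illustrative; the linear model below is
# trained on the unscaled features, as in the original code.
scaler = MinMaxScaler()
X_train_scaled = scaler.fit_transform(X_train)
X_test_scaled = scaler.transform(X_test)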
# Step 4: Model Selection
from sklearn.linear_model import LinearRegression
# Choose a linear regression model
model = LinearRegression()
# Step 5: Model Training
model.fit(X_train, y_train)
# Step 6: Model Evaluation
from sklearn.metrics import mean_squared_error, mean_absolute_error, r2_score
# Make predictions on test set
y_pred = model.predict(X_test)
# Calculate evaluation metrics
mse = mean_squared_error(y_test, y_pred)
mae = mean_absolute_error(y_test, y_pred)
r2 = r2_score(y_test, y_pred)
# Step 7: Fine-tune the Model
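# A minimal fine-tuning sketch (not from the original notebook): try the
# DecisionTreeRegressor imported at the top and search its max_depth with GridSearchCV.
from sklearn.model_selection import GridSearchCV

param_grid = {'max_depth': [3, 5, 10, None]}
grid_search = GridSearchCV(DecisionTreeRegressor(random_state=42), param_grid,
                           cv=5, scoring='neg_mean_squared_error')
grid_search.fit(X_train, y_train)
print("Best max_depth:", grid_search.best_params_)
print("Best cross-validated MSE:", -grid_search.best_score_)
tuned_model = grid_search.best_estimator_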
# Step 8: Make Predictions
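# A minimal prediction sketch using the fitted linear model (the joint values below are
# the illustrative configuration from Task 1, not a row taken from the dataset).
new_joints = pd.DataFrame([[-1.51, -0.763, 1.85, -0.817, 0.912, 2.32]],
                          columns=['q1', 'q2', 'q3', 'q4', 'q5', 'q6'])
predicted_xyz = model.predict(new_joints)
print("Predicted (x, y, z) for the new joint values:", predicted_xyz[0])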
# Step 9: Summarize Model's Performance
print("Model Evaluation Metrics:")
print("Mean Squared Error (MSE):", mse)
print("Mean Absolute Error (MAE):", mae)
print("R-squared (R2):", r2)
"""The provided code demonstrates the process of training a linear regression model on a given dataset and evaluating its performance using several metrics. Here is a summary of the model's performance evaluation:
Mean Squared Error (MSE): The mean squared error measures the average squared difference between the predicted values and the true values. It provides an indication of the model's accuracy. A lower MSE indicates better performance.
Mean Absolute Error (MAE): The mean absolute error calculates the average absolute difference between the predicted values and the true values. It represents the average magnitude of the errors made by the model. Smaller MAE values indicate better performance.
R-squared (R2): The R-squared value represents the proportion of the variance in the target variable that is explained by the model and measures how well the model fits the data. R2 is at most 1 (it can be negative when the model fits worse than simply predicting the mean), and a higher value indicates a better fit.
In the provided code, the linear regression model is trained using the training data (X_train and y_train). The model's performance is then evaluated by making predictions on the test data (X_test) and comparing them with the true values (y_test). The evaluation metrics (MSE, MAE, and R2) are calculated using the predicted values (y_pred) and the true values. Finally, the metrics are printed to summarize the model's performance.
"""