Merge remote-tracking branch 'origin/master'

Pavel Lutskov
2019-02-28 15:23:14 +01:00
22 changed files with 603 additions and 17 deletions

.gitignore

@@ -16,5 +16,16 @@ CMakeLists.txt.user
# Ignore PDFs on master
literature/
# Pictures
*.png
# But not the necessary ones
!docs/figures/**/*
# Presentation stuff
*.pptx
# fuck apple
.DS_Store
# Latex Stuff
build/

README.md

@@ -0,0 +1,64 @@
# Teleoperation NAO
## Project description
This ROS package allows you to remotely control a NAO robot by standing in
front of a webcam with two ArUco markers on your hands and one ArUco
marker on your chest.
You can move the NAO around using a "Human Joystick" approach and make it
imitate your arm motions. For more details, read our
[report](docs/report.latex).
Our package relies on the NAO being reachable from the computer and the
environment variable `NAO_IP` being set to the IP address of the NAO.
Our main code is written in Python and located in `script/` with the ArUco
detection and rviz GUI logic in C++ in `src/`.
## Requirements
* ROS Indigo on Ubuntu 14.04
* [aruco_ros](http://wiki.ros.org/aruco_ros)
* [nao_robot](http://wiki.ros.org/nao_robot)
* [nao_meshes](http://wiki.ros.org/nao_meshes)
* [usb_cam](http://wiki.ros.org/usb_cam)
We installed the dependencies by cloning them to our workspace and running
`catkin_make`, but `sudo apt install ros-indigo-package-name` should work even
better.
## Usage
Hang ArUco marker #0 on your chest, take marker #1 in your left hand and marker
#2 in your right hand. The markers should be 15 cm high/wide. Then you can run
```sh
$ roslaunch teleoperation calibration.launch
```
The NAO will tell you what to do.
After that you can launch the teleoperation routine by running:
```sh
$ roslaunch teleoperation fulltest.launch
```
If you want our awesome GUI, run
```sh
$ roslaunch nao_bringup nao_full.launch
$ roslaunch teleoperation with_gui.launch
```
If any problems arise with the GUI, such as the NAO not being properly
displayed, try pointing rviz to the correct NAO `.urdf`, which should be
somewhere in the `nao_description` package.
This diagram depicts what you should expect from the teleoperation routine.
![State Machine](docs/figures/teleoperation_overview.png)
Since this package relies on the NAO's voice recognition, you must be in the
same room as the NAO, and the room must be quiet.

docs/figures/aruco.png (binary, 282 KiB)



@@ -0,0 +1 @@
<mxfile modified="2019-02-11T16:57:25.345Z" host="www.draw.io" agent="Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) AppleWebKit/537.36 (KHTML, like Gecko) draw.io/9.3.1 Chrome/66.0.3359.181 Electron/3.0.6 Safari/537.36" etag="tWMgcySLVndEoJfxuZ_C" version="10.2.2" type="device"><diagram id="EtaSLEvNrXQQW6Ab2D67" name="Page-1">7Vddb9sgFP01fqwU48RpHpu06Sat06RsWrc3aqhNho2HceP01+9i8Ad2+rUlmjTtpeUe7sVwzj2IeMEqra4lzpMbQSj30IRUXnDpIbSYTeCvBvYGmE99A8SSEQP1gA17pBa0dXHJCC2cRCUEVyx3wUhkGY2Ug2Epxc5Nuxfc/WqOYzoCNhHmY/QrIyqxqB8uuol3lMWJ/fQ5mpuJFDfJ9iRFgonY9aDgygtWUghlRmm1olxz1/Bi6tZPzLYbkzRTrym4/Xj9ucDft3SP56vtl/z9I0nOzs0qD5iX9sB2s2rfMACrANkQLHcJU3ST40jP7EBuwBKVcoh8GOIiNwrcs4rCR5f3IlNWUT/UMeN8JbiQAGQi00sSXCQ6t16gUFL8aGkGgpZ2d1QqWj15bL8lE5qQipQquYcUW4Cmln/bgCi08a6TM2hykp6SLYhtC8Xt2h3LMLBEv4H06YhjKcqM1DxMXE5pxdRtb/zNpujxZdUP9k2QwRZv+8G3pl4HXVEdNVVbqtTeaoVLJQASUiUiFhnmH4TI2zXIhXZVpyAga6bPX6/jShi0ElIystlAQOBDlDKiz/DW+B/LmKpn8tDhhpCUY8Ue3H0cXVz/pI7qOchDQRAsFut1S/vAW0cwT7BwzTP1x+Y55J2TWcf3x2xCa21saE/+op0mb7BT2/ARx0XBIqfn/Td4Z3gdntQrf+oBW/pJMPhy2w/T2eAyDQZCG2/aqoHW7TZ+X370L5lr5g/IXPxlc81G7EoPhVxpDso7h+bwZ6nfLXVPnxV1U19AAgrzqianmYdRrP9vTUeYxWBz9XpmaqQgMKpcqQ4qcOA1gTmLM+1UUIACvtT6MHjMXdiJlBHCn+oN99bouRWFNl7jlHGt1gqnd5Jh2PdN/cwz2fYI6Ai9EQxeLa3xer0xO/RoOVVvhKfrDbhN/3fG62+N6ezF9+z8OLcGhN3PE3OFd7/xgqtf</diagram></mxfile>


@@ -0,0 +1 @@
<mxfile modified="2019-02-11T14:48:46.525Z" host="www.draw.io" agent="Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) AppleWebKit/537.36 (KHTML, like Gecko) draw.io/9.3.1 Chrome/66.0.3359.181 Electron/3.0.6 Safari/537.36" etag="brBwS-_WlH2769IkzczF" version="10.2.2" type="device"><diagram id="550VrYQlVZvYAyL5ooMI" name="Page-1">5VjbUtswEP2azNDOpOMLdsojCYE+AO1MygCPSqzYBttrZJk4/fpK1voWN6kJAdzpSyIdaVfalfbsygNzEmYXjMTeFTg0GBiakw3Ms4FhnFia+JXAWgGjY10BLvMdBdWAmf+LIohybuo7NGlM5AAB9+MmuIAoogvewAhjsGpOW0LQXDUmLm0BswUJ2uit73APUd0+qQa+Ud/1cOmvxkgNhKSYjJYkHnFgVYPM6cCcMACuWmE2oYH0XeEXJXe+ZbTcGKMR7yJgLWazZP40dTPTCafT6/Hd481QN5SaZxKkaPHAsAOhcLwEoVd4jCzUgP2Uyq2OJyScM5+IoStpYomLliv/WaEgSefSbL4Omgqk4mGSn/SpmKCP4qytJR0Yk8SDNHBoqVDYlutUM8plYhLtv84RzB8+1RfItZUrkDAWjWieyL9cu3Y4+xYeTXhfjBu+l2lHxbHWd6YuW7Ezo7G4QR0RktgFxj1wISLBtELHDNLIofKia6Ln8TAQTV00aebzOwl/sbB3j5Nk+yyrd9ZFJ+JsXROS3fv6WCWW9wq5B8r5GhmMpBwEVO32EiDGPSWcwWPJJga6FQX1UemBemhjtCeQMgzHLfGMFEmYS/mOeRj20rO1BZA4LiiEVFgmbwQNCPefm2RIkFPdcl5FO6KBzPMSFmqR0E0iw16bqGukTQCY40eEi1SweTuqs5fOXXk+p7NYkdZKpKTmfah72t7l6WfKOM12+qYYHSHBY4YrCX9VyxcF5tVSha29kTtPtnL66+iEbaeS17NGROA/oPze2ScumqFdHqUq3ggLhaT2OceuT7+X0OEYuStL7svCf2dX2T8noR9IkY1q5jXca3TkXqsj9yKfDHUTiaIzGaOuH+DnJRwSlbVJVNYGAamdo1S9kiwUFRNhuUwob/FUuff9qatdjmImmJWB0+tkYG/6WPvoZKC3k2s/KnxF+BxYAr1hwwOyfU/zmTE+sIU7DvBdK/3DVd9/fjMUr4mXvRnKTLYlW+2fbayO2ab4kNKTUt9qsZEqM/5RfjePP5zftS0e/akCs+lOFYZz1inqbRH1ebhvBH7vTmXzCWa+4RNMdKtvdqrsqT58mtPf</diagram></mxfile>

docs/figures/rviz_human.png (binary, 12 KiB)



@@ -0,0 +1 @@
<mxfile modified="2019-02-11T16:14:32.326Z" host="www.draw.io" agent="Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) AppleWebKit/537.36 (KHTML, like Gecko) draw.io/9.3.1 Chrome/66.0.3359.181 Electron/3.0.6 Safari/537.36" etag="tVnTRcwO-PyhhK9-fgSm" version="10.2.2" type="device"><diagram id="nT2Y2rybpLhVDeGiO-tx" name="Page-1">7Vvbdps4FP0aPyYLEDc/xs61ndtaTlamfZNBNrQYUZATu18/Eki2BCQmDhjSTh5idBACztna2joSIzBdbW5SmAR/Yh9FI0PzNyNwOTKMsaXR/8ywLQyOqReGZRr6hUkyzMKfiBv5dct16KNMqUgwjkiYqEYPxzHyiGKDaYqf1WoLHKl3TeASVQwzD0ZV62Pok4BbdXu8P3GLwmXAb+0aTnFiBUVl/iZZAH38LJnA1QhMU4xJcbTaTFHEfCf8Ulx3/cLZ3YOlKCZNLggenW+WHmzd61t9dm+ZiXH99cw0imaeYLTmb8yflmyFC1K8jn3EWtFGYPIchATNEuixs8805tQWkFVESzo9XOCY8CjqNi9PcYTTvC2g5X/MHkaRsMc4pvUnPsyC/Dasnerb8Rd+QilBG8nE3/YG4RUi6ZZWEWeF57ciaLz8vA8kcLktkGJoCCPk4Fnu2t77lx5wF7/B3bpZ4277x5phYPI5ZA2IUiUMyKfA5EWckgAvcQyjq711ogZKCgrahORfZj53LF78wmux48uNXNiKQkzfWL6Klb+IFllhf11eEhd+Q4RsOQbgmmBq2j/vHxgnAioSBCg0fAu5vkntGUnxdySdcY05sO3dGdEPjcNwMy/H4xxuL8Ipw+vUQ6+FjJMOTJeIvFLPLeqxKL0KzhRFkIRPKr20DjQBfQlo15Bd1gWoZEzpTTGlQErrBlIL10OeVwepuWuZlnYMpBigTgQpevtBYQp0AR9dAo/ET4fwo6BHIqiWOQkid1ELINtz0XxRC6DugWEMCxdu3aAWEd53cnEm/LYf3YDp59wsmewl/+WWi3SVKeej4ve6aFZU74DSFEwaDTGpq5g02sJkA4p680haJjXnUnOck2DXHBR29Qp073yKkZclsN5AAh8VDTmeLYhfYKviFxhV8bsTyLL4tbvSvmZVkrRDEze4AUkU5nl6OtpoPJRJTNGmFmpRRtfO2mRoH08bblMt1BdtnH2azIJ5cOVk+K+b2QMaaz/cGtroSl0fJ49KQ1FX8ugE+roU8I5BZr/Kqdo5cDR+z8Y4y1u7SFO4lSokOIxJJt3sH2bY38lxVPY2bEvG7MH6pmuVMF48Qelq8Th4scioY8q9YueB40m/Kg1nEULJLzDAWoZ5bqlB6n2I1TuZoQ1pgt/qbOtjpnb06uAzhVE4T+m9cfzOntVFP1F7SV0W9sS9pEpKw8qNqYP3xx27j+mU5puG6sF0SlDtlO1MbmaEBrCPHMiw8nIfIwfSGLx9YRc62tndY+QC78L9Oru/n2+de5FN7G1N6pglKRVmXaba3q0121+SagqzYemW6tLn3SokbYiWIUwHdDC0fJtRHZJmBMY+NT1UJ2B9iZ2ukuYfQ+0IFXP6rnxUWsIs7aiwQGnLSTktMX5ffQO8ksZoK0UhYlCZDlBnepgywLZVeloskF2PRN8Zz4ssa/fLAabZNz3p42HPwn5vYjI+5jTMMDvBjzIV6ns5uN8tCo2BMSxcVLjmEUbfw3jZrvI8HIoOqN0enPIEoOLuhwyl1LLCtXKfvilRPZuiLPwJ53kF1jm4MqG1rcnIuqQW1kUy7klWjMJlTI8jtGBNMfeFHowuuJllUWhcaCRp0O/zvnRmHmZSTQPFjKGFOFl6KU51Q7BWtx21szh1tXFnGuEM/Z+1ancJniOx/SV40DSfwNX4UEgdVDMKt3R+S+cvGk5Qyysh704v7vZQtEAl5WSDbvVO+VU13w6V/J1Hsswk8j6e33pX4OC5xWrKLcPaFyieu4ZbPDa4+b8wu5RzBf2zi1nP9Dmr0M6DWQMJzrKQCUaJIqT4CAZhfjsrdOMFraC7yUbhmxcJpKpR1eDwj5hqvmsS0tSjIaEquCpOV6Hv5xRUhxqVlpRvpGQUud1iwnAPr+JToFQxAd6OCVrcfxVXZPn2nxaCq/8A</diagram></mxfile>

docs/figures/usr_pt.png (binary, 104 KiB)

docs/figures/usr_pt.xml

@@ -0,0 +1 @@
<mxfile modified="2019-02-11T15:37:37.098Z" host="www.draw.io" agent="Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) AppleWebKit/537.36 (KHTML, like Gecko) draw.io/9.3.1 Chrome/66.0.3359.181 Electron/3.0.6 Safari/537.36" etag="lYmM-42JGF0luk8BAXf0" version="10.2.2" type="device"><diagram id="tJx720wRuxbQNEWs5qEG" name="Page-1">7Vjbcps6FP0aPzoDiJsfbSfuSafttEkvc85LRwGBdSIjKuRbvr5bIDAXuXETZ5qH+iFBS0KCvdZe2mKE5qvdG4Hz5XseEzZyrHg3QpcjxwkmDvxVwL4CfDusgFTQuILsA3BLH4gGLY2uaUyKzkDJOZM074IRzzISyQ6GheDb7rCEs+6qOU7JALiNMBui32gslxUaOsEB/4fQdFmvbPuTqmeF68H6TYoljvm2BaGrEZoLzmV1tdrNCVOxq+NS3bc40ts8mCCZPOWG/N9FLL6FfHY9QeRt+vXDJ+9hrMnYYLbWL6wfVu7rCMAsEGxozLZLKsltjiPVswW6AVvKFYOWDZe4yCsGErojsOiM4TvCZji6TwVfZ/GcMy6gO+OZmi3hmdR8265u10NGDrIshHxf4ZSx3q0xLpZqhXJZ/QZESLI7Ghq7CTgIlfAVkWIPQ/QNjqc50iK16/b2QLlraWzZorsBsZZZ2sx9YAIuNBlmYsZR8ANfvp3z63e3/mZPbz7NrLHzksS0QwqhTsKIRBHghRT8nrR67kLPhVhosvTSx8JrIOFoxFHgdSLu1O1WxBsW2hFvwPNHPBiE/D3fEECUhLdYxAMGIKVzdVnQLGVkquxGqZMKCDflmerha/U6jzH060ypermIiTDlUEMLTMkFfQAMq4kVb6UF1sZlXbjOxPabX1DmXTlEJ6J1gQLkhy7Yr+f4qMrLrlhiTMLEKBY/Csld8otMPoNuXLubqa4z1E1okk097uyyQciQqT6TOg4dwfg/1rzuGBdlyKcwwA7zXRmeuh+uUvV/tk4SImDEf4puPSs8ZTVxNWYgSoik7Kqry1QtnaGtYkZTJdoI+IFl0UzxQmEvnOqOFY1jdsxwSuWWrjw0jK7bhyftDW1l+Rb8zqWgrvMgg4ICg4LO4TvTq0nOP38c///9y9SXb/brxfX92KSfvtNnsTaXy4jhoqBV/mEhh/DjrgIBXYTzq/m8R4sH7WgtNs3OWummdg/HwOrQAx5lisSdgmvIU4sH0wZQY4IwLOmmW6aZyNErfOS0zMZ6A5p0jQTZPYMA5xYR0Xe1y6reRJ79yERAU0rkYCJgDu9bw3I1oDj+wG6vRnE8rye9asaDEJuYnqRNY3k41KbeEm9KQn5/PyS4kH9oO7Rf53b4rMLVc3pmZg8L12bra2dReAY3MyrGPaaYdyR5imC25K9gziiY/knnzwvGOyaYBRdPLLozLv4W3WcUTb/oNh2PX6zoNorGH4jmhktc0v86y+FnfQppiirDKfyZxPZO4ZPg4rRq2HkpO5gY7WBFylKoOga9SoI7n6ROZzs0Ze5Lsd0reRsS295v+sr1hKMPNA9fNqtq9PB5GF39BA==</diagram></mxfile>

docs/report.latex

@@ -0,0 +1,352 @@
\documentclass[conference]{IEEEtran}
% \IEEEoverridecommandlockouts
% The preceding line is only needed to identify funding in the first footnote.
% If that is unneeded, please comment it out.
\usepackage{cite}
\usepackage{amsmath,amssymb,amsfonts}
% \usepackage{algorithmic}
\usepackage{graphicx}
\usepackage{textcomp}
\usepackage{xcolor}
\usepackage{subcaption}
\usepackage{todonotes}
\usepackage{hyperref}
\def\BibTeX{{\rm B\kern-.05em{\sc i\kern-.025em b}\kern-.08em
T\kern-.1667em\lower.7ex\hbox{E}\kern-.125emX}}
\begin{document}
\title{Humanoid Robotic Systems - ``Teleoperating NAO''}
\author{Pavel Lutskov, Luming Li, Lukas Otter and Atef Kort}
\maketitle
\section{Project Description}
This semester, the task of our group was to program a routine for the
teleoperation of a NAO robot. Using ArUco markers placed on the operator's
chest and hands, the position and posture of the operator were to be
determined by detecting the markers' locations with a webcam, and the
appropriate commands were then to be sent to the robot so that it imitates
the motions of the operator. An overview of the
process can be seen in \autoref{fig:overview}. The main takeaway from
fulfilling this objective was practicing the skills that we acquired during the
Humanoid Robotic Systems course and getting familiar with the NAO robot as a
research and development platform.
\begin{figure}[h]
\centering
\includegraphics[width=\linewidth]{figures/teleoperation_overview.png}
\caption{Overview of the defined states and their transitions.}
\label{fig:overview}
\end{figure}
In closer detail, once the markers are detected, their coordinates relative to
the webcam are extracted. The position and orientation of the user's
chest marker are used to control the movement of the NAO around the environment.
We call this approach a ``Human Joystick'' and describe it in more detail in
\autoref{ssec:navigation}.
The relative locations of the chest and hand markers can be used to determine
the coordinates of the user's end effectors (i.e.\ hands) in the user's chest
frame. In order for the NAO to imitate the arm motions, these coordinates need
to be appropriately remapped into the NAO torso frame. With the knowledge of the
desired coordinates of the hands, the commands for the NAO joints can be
calculated by using the Cartesian control approach. We present a thorough
discussion of the issues we had to solve and the methods we used for arm motion
imitation in \autoref{ssec:imitation}.
Furthermore, in order to make the teleoperation as intuitive as possible, a user
interface needed to be developed. In our system, we present the operator
with the current estimate of the operator's pose, the robot pose based on
sensor feedback, the camera feeds from both of the NAO's cameras, and the
webcam view of the operator. In order for the user to be able to give explicit
commands to the robot, such as a request to open or close the hands or to
temporarily suspend the operation, we implemented a simple voice command system.
Finally, to accommodate different users and to perform control in
different conditions, a small calibration routine was developed, which
quickly takes a user through the process of setting up the teleoperation.
We elaborate on the tools and approaches that we used to implement the
user-facing features in \autoref{ssec:interface}.
An example task that can be accomplished with our teleoperation package might be
the following: the operator safely and precisely navigates the robot through an
uncharted environment with a high number of obstacles to some lightweight
object, such as an empty bottle, then makes the robot pick up that object and
bring it back to the operator. Thanks to the high precision of the arm
motions and the constant operator input, the robot is able to pick up objects
of different shapes and sizes, applying different strategies when needed. We
demonstrate the functioning of our system in the supporting video.
We used ROS as the framework for our implementation. ROS is a well-established
framework for developing robot-targeted applications, with a rich support
infrastructure and a modular approach to organizing the logic. For interacting
with the robot we mainly relied on the NAOqi Python API. The advantage of using
Python compared to C++ is a much higher speed of development and more concise
and readable resulting code.
\section{System Overview}
\subsection{Vision}\label{ssec:vision}
\begin{itemize}
\item Camera calibration
\item ArUco marker extraction
\item TF world coordinate publishing
\end{itemize}
\begin{figure}
\centerline{\includegraphics[width=0.8\linewidth]{figures/aruco.png}}
\caption{ArUco marker detection on the operator.}
\label{fig:aruco_detection}
\end{figure}
\subsection{Interface}\label{ssec:interface}
\paragraph{Speech State Machine}
The speech state machine is based on the NAOqi API and the NAO's built-in
voice recognition; the available commands are listed in
\autoref{tab_speech_states}.
\begin{table}
\caption{Commands of the speech recognition module}
\begin{center}
\begin{tabular}{|c|c|c|}
\hline
\textbf{Command}&\textbf{Action}&\textbf{Available in state} \\
\hline
``Go'' & Wake Up & Sleep \\
\hline
``Kill'' & Go to sleep & Idle, Imitation \\
\hline
``Arms'' & Start imitation & Idle \\
\hline
``Stop'' & Stop imitation & Imitation \\
\hline
``Open'' & Open hands & Idle, Imitation \\
\hline
``Close'' & Close hands & Idle, Imitation \\
\hline
\end{tabular}
\label{tab_speech_states}
\end{center}
\end{table}
\paragraph{Teleoperation Interface}
In order to make it possible to operate
the NAO without visual contact, a teleoperation interface was developed. This
interface allows the operator to receive visual feedback on the NAO as well as
additional information regarding their own position.
The NAO part of the interface contains the video streams of the top and bottom
cameras on the robot's head. These were created by subscribing to their
respective topics (FIND NAME) using the \textit{rqt\_gui} package. Moreover, it
also includes an rviz window which gives a visual representation of the NAO. For
this, the robot's joint positions are displayed by subscribing to the tf topic,
where the coordinates and the different coordinate frames are published. We
further used the \textit{nao\_meshes} package to create the 3D model of the NAO
(see \autoref{fig_interface}).
\begin{figure}
\centering
%\hfill
\begin{subfigure}[b]{0.4\linewidth}
\includegraphics[width=\linewidth]{figures/rviz_human.png}
\caption{}
%{{\small $i = 1 \mu m$}}
\label{fig_human_model}
\end{subfigure}
\begin{subfigure}[b]{0.4\linewidth}
\includegraphics[width=\linewidth]{figures/interface_nao.png}
\caption{}
%{{\small $i = -1 \mu A$}}
\label{fig_nao_model}
\end{subfigure}
\caption{Operator and NAO in rviz.}
\label{fig_interface}
\end{figure}
\subsection{Navigation}\label{ssec:navigation}
One of the two main features of our system is an intuitive navigation tool, a
``Human Joystick'' with three degrees of freedom, which allows the robot to
navigate the environment by tracking the user's movements.
By fixing an ArUco marker on the user's chest, we can continuously track its
position and orientation in three-dimensional space and thereby capture the
user's motion.
In order to simplify the task, we define a buffer zone inside which the robot
only tracks the orientation of the user; depending on the direction in which
the user exits the zone, the robot goes forward, backward, left or right.
The covered distance also influences the speed of the robot: the further the
user is from the center of the buffer zone, the faster the robot moves. The
extent of the movement and the buffer zone are determined automatically
through calibration.
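To make this mapping concrete, one possible formulation consistent with the
description above (the symbols $p_{buf}$, $p_{max}$ and $V_{max}$ are chosen
here purely for illustration) is a linear ramp between the buffer-zone boundary
$p_{buf}$ and the calibrated extreme $p_{max}$ along each axis:
$$v(p) = V_{max}\,\min\left(1,\;\max\left(0,\;
\frac{p - p_{buf}}{p_{max} - p_{buf}}\right)\right)$$
where $p$ is the current position of the chest marker along that axis.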
\begin{figure}
\centering
\includegraphics[width=0.8\linewidth]{figures/usr_pt.png}
\caption{User position tracking model}
\label{fig_user_tracking}
\end{figure}
\subsection{Imitation}\label{ssec:imitation}
One of the main objectives of our project was the imitation of the operator's
arm motions by the NAO. To perform this, the appropriate mapping between the
relative locations of the detected ArUco markers and the desired hand positions
of the robot first needs to be calculated. Then, based on the target
coordinates, the robot's joint rotations need to be computed.
\paragraph{Posture retargeting}
First, let us define the notation of the coordinates that we will use to
describe the posture retargeting procedure. Let $r$ denote the 3D $(x, y, z)$
coordinates; the subscript then identifies the object which has these
coordinates, and the superscript identifies the coordinate frame in which they
are taken. So, for example, $r_{hand,NAO}^{torso,NAO}$ gives the coordinates of
the hand of the NAO robot in the frame of the robot's torso.
After the ArUco markers are detected and published on ROS TF, as was described
in \autoref{ssec:vision}, we have the three vectors $r_{aruco,chest}^{webcam}$,
$r_{aruco,lefthand}^{webcam}$ and $r_{aruco,righthand}^{webcam}$. We describe
the retargeting for one hand, since it is symmetrical for the other hand. We
also assume that all coordinate systems have the same orientation, with the
z-axis pointing upwards, the x-axis pointing straight into the webcam and the
y-axis to the left of the webcam. Therefore, we can directly calculate the hand
position in the user chest frame by means of the following equation:
$$r_{hand,user}^{chest,user} = r_{aruco,hand}^{webcam} -
r_{aruco,chest}^{webcam}$$
Next, we remap the hand coordinates in the chest frame into the user shoulder
frame, using the following relation:
$$r_{hand,user}^{shoulder,user} =
r_{hand,user}^{chest,user} - r_{shoulder,user}^{chest,user}$$
We know the coordinates of the user's shoulder in the user's chest frame from
the calibration procedure, described in \autoref{ssec:interface}.
Now, we perform the retargeting of the user's hand coordinates to the desired
NAO's hand coordinates in the NAO's shoulder frame with the following formula:
$$r_{hand,NAO}^{shoulder,NAO} =
\frac{L_{arm,NAO}}{L_{arm,user}} r_{hand,user}^{shoulder,user}$$
As before, we know the length of the user's arm through calibration and the
length of the NAO's arm through the specification provided by the manufacturer.
A final step of the posture retargeting is to obtain the coordinates of the
end effector in the torso frame. This can be done through the following relation:
$$r_{hand,NAO}^{torso,NAO} =
r_{hand,NAO}^{shoulder,NAO} + r_{shoulder,NAO}^{torso,NAO}$$
The coordinates of the NAO's shoulder in the NAO's torso frame can be obtained
through a call to the NAOqi API.
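Putting these steps together, the whole retargeting chain can be written as a
single expression:
$$r_{hand,NAO}^{torso,NAO} =
\frac{L_{arm,NAO}}{L_{arm,user}}\left(r_{aruco,hand}^{webcam} -
r_{aruco,chest}^{webcam} - r_{shoulder,user}^{chest,user}\right) +
r_{shoulder,NAO}^{torso,NAO}$$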
Now that the desired positions of the NAO's hands are known, the appropriate
joint motions need to be calculated by means of Cartesian control.
\paragraph{Cartesian control}
For this, a singularity-robust Cartesian controller was built.
The output of our Cartesian controller consists of the four angles of the
rotational joints of the shoulder and the elbow of each arm of the NAO robot,
computed via the inverse kinematics formula
$$\Delta\theta = J^{-1,robust}\Delta r$$
To build the Cartesian controller, the Jacobian matrix is needed first. Each
column of the Jacobian matrix approximates how the end effector moves when the
corresponding joint moves. There are two main ways to determine the Jacobian
matrix. The first is the numerical method, where this approximation is obtained
by perturbing each rotational joint by a small angle and observing how the end
effector moves. With this, each column of the Jacobian matrix can be
approximated as follows:
$$\frac{\partial r}{\partial\theta} \approx \frac{\Delta r}{\Delta\theta} =
\left(
\begin{array}{ccc}
\frac{\Delta r_x}{\Delta\theta} &
\frac{\Delta r_y}{\Delta\theta} &
\frac{\Delta r_z}{\Delta\theta}
\end{array}
\right)^{T}$$
The other method is the analytical method, which was used in this project.
Since only rotational joints are available, each column of the Jacobian
matrix, i.e.\ the tangent direction induced by a rotational joint, can be
calculated using the cross product between the rotation axis $e$ of the joint
and the vector $r_{end}-r_{joint}$:
$$
\frac{\partial r_{end}}{\partial\theta _{joint}} =
(e \times (r_{end}-r_{joint}))
$$
This gives us one column of the Jacobian matrix, and it can be repeated for
each rotational joint until the whole matrix is filled.
The next step for the Cartesian controller is to determine the inverse of the
Jacobian matrix for the inverse kinematics. For this, singular value
decomposition is used.
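For illustration, the two building blocks described above can be sketched in a
few lines of NumPy. This is a simplified sketch rather than our exact
implementation; the function names and the damping constant are illustrative:
\begin{verbatim}
import numpy as np

def jacobian_column(axis, r_end, r_joint):
    # Analytical Jacobian column for one rotational joint:
    # the tangent direction e x (r_end - r_joint)
    return np.cross(axis, r_end - r_joint)

def robust_inverse(jac, damping=0.01):
    # Singularity-robust (damped) inverse of the Jacobian via SVD
    u, s, vt = np.linalg.svd(jac, full_matrices=False)
    s_inv = s / (s ** 2 + damping ** 2)
    return vt.T.dot(np.diag(s_inv)).dot(u.T)

# delta_theta = robust_inverse(jac).dot(delta_r)
\end{verbatim}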
\section{System Implementation and Integration}
Once the individual modules were designed and implemented, the whole system
needed to be assembled. It is crucial that the states of the robot and
the transitions between them are well defined and correctly executed. The
state machine that we designed can be seen in \autoref{fig:overview}.
The software package was organized as a collection of ROS nodes, controlled by
a single master node. The master node keeps track of the current system state,
and the slave nodes consult the master node to check if they are allowed
to perform an action. To achieve this, the master node creates a server for a
ROS service named \verb|inform_masterloop|; the service call takes as
arguments the name of the caller and the desired action, and responds with a
Boolean value indicating whether permission to perform the action was
granted. The master node can then update the system state based on the received
action requests and the current state. Some slave nodes, such as the walking or
imitation nodes, run in a high-frequency loop and therefore consult the
master in each iteration of the loop. Other nodes, such as the fall detector,
only inform the master about the occurrence of certain events, such as a fall
or fall recovery, so that the master can deny requests for any activities
until the fall recovery is complete.
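As an illustration of this pattern, the slave side of the handshake might look
roughly as follows. This is only a sketch: the service type is assumed to be
generated from the package's \verb|srv| definition, and the name of the Boolean
response field (\verb|ok| here) is an assumption.
\begin{verbatim}
import rospy
from teleoperation.srv import InformMasterloop

def ask_master(module, message):
    # Ask the master node for permission to perform an action.
    try:
        call = rospy.ServiceProxy('inform_masterloop', InformMasterloop)
        return call(module, message).ok
    except rospy.ServiceException:
        return False

# e.g. inside the walking loop:
# if ask_master('walker', 'move'):
#     ...perform one movement step...
\end{verbatim}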
\section{Drawbacks and Conclusions}
% \begin{table}[htbp]
% \caption{Table Type Styles}
% \begin{center}
% \begin{tabular}{|c|c|c|c|}
% \hline
% \textbf{Table}&\multicolumn{3}{|c|}{\textbf{Table Column Head}} \\
% \cline{2-4}
% \textbf{Head} & \textbf{\textit{Table column subhead}}& \textbf{\textit{Subhead}}& \textbf{\textit{Subhead}} \\
% \hline
% copy& More table copy$^{\mathrm{a}}$& & \\
% \hline
% \multicolumn{4}{l}{$^{\mathrm{a}}$Sample of a Table footnote.}
% \end{tabular}
% \label{tab_sample}
% \end{center}
% \end{table}
% \begin{thebibliography}{00}
% \bibitem{b1}
% G. Eason, B. Noble, and I. N. Sneddon,
% ``On certain integrals of Lipschitz-Hankel type involving
% products of Bessel functions,''
% Phil. Trans. Roy. Soc. London, vol. A247, pp. 529--551, April 1955.
% \end{thebibliography}
\end{document}


@@ -1,4 +1,10 @@
#! /usr/bin/env python
"""The script for calibrating the arm lengths and 'joystick' extents.
You will normally not call this script directly but rather with
`roslaunch teleoperation calibration.launch`
"""
import os
import json
@@ -12,6 +18,23 @@ from teleoperation.msg import RequestSpeechAction, RequestSpeechGoal
def calib_field(sentences, src_frames, targ_frames):
"""Retrieve a relative transform of some marker.
Parameters
----------
sentences : list of str
What to say.
src_frames : list of str
The names of tf source frames (from where to transform)
targ_frames : list of str
The names of tf target frames (to where to transform)
Returns
-------
list of transforms
"""
ok = False
t_and_q = []
@@ -34,6 +57,7 @@ def calib_field(sentences, src_frames, targ_frames):
def calib_center():
"""Retrieve the coordinates of the 'joystick' zero state."""
trans, q = calib_field([
'Stand in front of camera',
'Far enough to see your arms'
@@ -43,6 +67,7 @@ def calib_center():
def calib_arms():
"""Retrieve the hand positions relative to the chest."""
ts_and_qs = calib_field(
['Now stretch your arms', 'Make sure the markers are detected'],
['odom'] * 3,
@@ -58,6 +83,7 @@ def calib_arms():
def calib_shoulders():
"""Retrieve the shoulder positions relative to the chest."""
ts_and_qs = calib_field(
['Now place the markers on the corresponding shoulders',
'Make sure the markers are detected'],
@@ -74,6 +100,7 @@ def calib_shoulders():
def calib_rotation():
"""Retrieve the limits of the z-axis of the 'joystick'."""
rots = []
for side in ('left', 'right'):
_, q = calib_field(
@@ -86,6 +113,7 @@ def calib_rotation():
def calib_extremes():
"""Retrieve the limits of the x and y axes of the 'joystick'."""
transs = []
for direction in ('forward', 'backward', 'to the left', 'to the right'):
trans, _ = calib_field(
@@ -100,6 +128,7 @@ def calib_extremes():
def calibration():
"""Run full calibration and dump the results to the config file."""
center = calib_center()
larm, rarm = calib_arms()
lsh, rsh = calib_shoulders()


@@ -1,5 +1,6 @@
#! /usr/bin/env python
"""Library for executing control for a given effector"""
from __future__ import division
import os
import json
@@ -28,6 +29,7 @@ NAO_ARM = 0.22
def xyz(joint):
"""Get 3D coordinate of a joint in torso frame."""
return np.array(mp.getPosition(joint, FRAME_TORSO, False),
dtype=np.float64)[:3]
@@ -53,12 +55,14 @@ best_guess = {
def get_transform(joint):
"""Retrieve a transform matrix of a joint in the torso frame."""
return np.array(
mp.getTransform(joint, FRAME_TORSO, False), dtype=np.float64
).reshape((4, 4))
def diff_jacobian(side):
"""Update and return the differential Jacobian. NOT GOOD."""
new_end_xyz = xyz('{}Arm'.format(side))
delta_r = new_end_xyz - crt_xyz[side]
crt_xyz[side] = new_end_xyz.copy()
@@ -75,6 +79,7 @@ def diff_jacobian(side):
def jacobian(side):
"""Calculate the analytical Jacobian for side 'L' or 'R'."""
end_xyz = xyz('{}Arm'.format(side))
xyzs = np.array([xyz(side + j) for j in JOINTS])
@@ -84,6 +89,7 @@ def jacobian(side):
ax_order = (y_axis, z_axis, x_axis, z_axis)
# Calculate the Jacobian after the formula from the report
axes = np.concatenate(
[get_transform(side + j).dot(ax) for j, ax in zip(JOINTS, ax_order)],
axis=1
@@ -94,6 +100,7 @@ def jacobian(side):
def to_nao(my_xyz, side):
"""Transform object coordinates from operator chest frame to NAO torso."""
sh_offs = np.array(MY_SHOULDERS[side])
my_sh_xyz = my_xyz - sh_offs
nao_sh_xyz = my_sh_xyz / ARM * NAO_ARM
@@ -102,6 +109,7 @@ def to_nao(my_xyz, side):
def inv_jacobian(j):
"""Singularity-robust inverse Jacobian."""
u, s, vt = np.linalg.svd(j)
s_inv = np.zeros((vt.shape[0], u.shape[1]))
np.fill_diagonal(s_inv, 1 / s)
@@ -110,6 +118,22 @@ def inv_jacobian(j):
def _some_cartesian(my_xyz, side, jfunc):
"""A generic cartesian controller.
Parameters
----------
my_xyz : numpy.array
Coordinates of the object/target in the operator chest frame.
side : str, 'L' or 'R'
jfunc : func
A function that will return a jacobian matrix for the given side.
Returns
-------
numpy.array
The control to execute in the joint space.
"""
nao_xyz = to_nao(my_xyz, side)
delta_r = 0.1 * (nao_xyz - xyz('{}Arm'.format(side)))
crt_q = mp.getAngles([side + j for j in JOINTS], False)
@@ -118,19 +142,23 @@ def _some_cartesian(my_xyz, side, jfunc):
def our_cartesian(my_xyz, side):
"""Our cartesian controller."""
return _some_cartesian(my_xyz, side, jacobian)
def our_diff_cartesian(my_xyz, side):
"""Our differential cartesian controller."""
return _some_cartesian(my_xyz, side, diff_jacobian)
def _elbow(arm_):
"Dumb way to calculate the elbow roll."""
quot = min(1.0, arm_ / 0.70) quot = min(1.0, arm_ / 0.70)
return radians(180 * (1 - quot)) return radians(180 * (1 - quot))
def dumb(my_xyz, side): def dumb(my_xyz, side):
"""Our dumb controller that directly maps 3D coords into joint space."""
sign = 1 if side == 'L' else -1
roll = asin(my_xyz[1] / np.linalg.norm(my_xyz))
roll -= sign * radians(25)
@@ -142,11 +170,24 @@ def dumb(my_xyz, side):
def movement(my_xyz, side, control):
"""Execute the movement.
my_xyz : numpy.array
Coordinates of the object/target in the operator chest frame.
side : str, 'L' or 'R'
control : func
A function to calculate the control for the side, depending on the
target coordinates. It must return the `numpy.array`
joint vector of format
[ShoulderPitch, ShoulderRoll, ElbowYaw, ElbowRoll].
"""
ref = control(np.array(my_xyz), side).tolist()
mp.setAngles([side + j for j in JOINTS], ref, 0.3)
def nao_cart_movement(my_arm_xyz, side, *_):
"""Execute the movement using the NAO cartesian controller."""
nao_xyz = to_nao(my_arm_xyz, side)
mp.setPositions('{}Arm'.format(side), FRAME_TORSO,
tuple(nao_xyz.tolist()) + (0, 0, 0),


@@ -1,4 +1,10 @@
#! /usr/bin/env python
"""A module for graceful fall handling.
ROS node: `fall_detector`.
Uses `inform_masterloop` service.
"""
import os
import rospy
@@ -15,6 +21,7 @@ _inform_masterloop = inform_masterloop_factory('fall_detector')
class FallDetectorModule(ALModule):
"""An ALModule responsible for fall handling."""
def __init__(self, name):
ALModule.__init__(self, name)
@@ -23,7 +30,7 @@ class FallDetectorModule(ALModule):
_inform_masterloop('falling')
def on_robot_fallen(self, *_args):
if not _inform_masterloop('fallen'):  # Seize the control
return
mp.rest()
rospy.Rate(0.5).sleep()
@@ -32,9 +39,10 @@ class FallDetectorModule(ALModule):
am.setExpressiveListeningEnabled(False)
pp = ALProxy('ALRobotPosture')
while not pp.goToPosture('StandInit', 1.0):
# Try to stand up indefinitely
pass
rospy.Rate(1).sleep()
_inform_masterloop('recovered')  # Release the control
if __name__ == "__main__":


@@ -1,4 +1,9 @@
#! /usr/bin/env python
"""A small module for hand controls service.
ROS Node: `hand_ler`
Provides `hands` service.
"""
import os
import rospy
@@ -12,6 +17,12 @@ mp = None
def do_in_parallel(func):
"""Open hands in parallel.
The robot's hand function blocks and only accepts one hand. This is a
thread-based hackaround.
"""
tl = Thread(target=func, args=('LHand',))
tr = Thread(target=func, args=('RHand',))
tl.start()
@@ -21,6 +32,7 @@ def do_in_parallel(func):
def handle_hands(request):
"""Hand service routine."""
if request.target == 'open':
do_in_parallel(mp.openHand)
return HandsResponse('opened')


@@ -1,4 +1,10 @@
#! /usr/bin/env python
"""The node performing the imitation.
ROS Node: `imitator`
Uses `inform_masterloop` service.
"""
from argparse import ArgumentParser
from math import radians
@@ -18,6 +24,7 @@ TORSO = False
def controller_factory(ctrl):
"""Create a controller depending on the command line argument."""
if ctrl == 'nao_cartesian':
return lambda my_arm_xyz, side: nao_cart_movement(my_arm_xyz, side)
@@ -45,8 +52,8 @@ if __name__ == '__main__':
ll = tf.TransformListener()
while not rospy.is_shutdown():
rospy.Rate(20).sleep()  # Run at 20 Hz or something
if not _inform_masterloop('imitate'):  # Try to seize the control
continue
rospy.logdebug('IMITATOR: ACTIVE')
@@ -61,7 +68,7 @@ if __name__ == '__main__':
# except Exception as e:
# rospy.logwarn(e)
for i, side in enumerate(['L', 'R'], 1):  # One arm at a time
try:
a0, _ = ll.lookupTransform(
'odom',
@@ -76,9 +83,12 @@ if __name__ == '__main__':
except Exception as e:
rospy.logwarn(e)
continue
# Calculate the position of the hand in my chest frame
my_arm_xyz = np.array(arm) - np.array(a0)
imitation_cycle(my_arm_xyz, side)  # Do the actuation
# Set hands to sane positions
mp.setAngles(('LWristYaw', 'RWristYaw'),
(radians(-70), radians(70)), 0.3)


@@ -1,4 +1,6 @@
#! /usr/bin/env python
"""The master state machine node."""
import os
from argparse import ArgumentParser
@@ -37,6 +39,7 @@ mp = ALProxy('ALMotion', os.environ['NAO_IP'], 9559)
def init_voc_state_speech():
"""Initialize the transitions."""
global VOC_STATE, SPEECH_TRANSITIONS
VOC_STATE = {
'idle': [KILL] + ([IMITATE] if not AI else []),
@@ -55,6 +58,7 @@ def init_voc_state_speech():
def hands(what):
"""Handle the command to do something with the hands."""
try:
_hands = rospy.ServiceProxy('hands', Hands)
newstate = _hands(what).newstate
@@ -64,6 +68,7 @@ def hands(what):
def speech_done_cb(_, result):
"""Handle the recognized command."""
global speech_in_progress, state, hand_state
_state_old = state
rospy.loginfo('SPEECH: {}'.format(result))
@@ -83,6 +88,12 @@ def speech_done_cb(_, result):
def inform_masterloop_factory(who):
"""Create a function to inform the masterloop.
Every node informs the master, so it's nice to have a factory for
these functions.
"""
def inform_masterloop(what):
try:
inform_masterloop = rospy.ServiceProxy('inform_masterloop',
@@ -96,6 +107,10 @@ def inform_masterloop_factory(who):
def handle_request(r):
"""Handle a node's request to seize/release control.
Update the state if needed.
"""
global state
module = r.module
message = r.message
@@ -164,6 +179,8 @@ if __name__ == '__main__':
ic = rospy.Service('inform_masterloop', InformMasterloop, handle_request)
# Necessary initializations all have been performed by now
def _shutdown():
if speech_in_progress:
speech.cancel_goal()
@@ -171,7 +188,7 @@ if __name__ == '__main__':
rospy.on_shutdown(_shutdown)
while not rospy.is_shutdown():  # Run the speech detection loop
if state in ('idle', 'imitate', 'dead'):
if not speech_in_progress:
speech_in_progress = True
@@ -184,4 +201,4 @@ if __name__ == '__main__':
speech.cancel_goal()
speech_in_progress = False
rospy.Rate(10).sleep()  # Run at 10 Hz or something


@@ -1,4 +1,10 @@
#! /usr/bin/env python
"""A module for speech detection.
ROS Node: `speech_server`
Provides `request_speech` action
"""
import os
import rospy
@@ -13,6 +19,7 @@ almem = None
def request_speech(goal):
"""Handle the request for speech detection."""
rospy.loginfo('SPEECH SERVER: NEW GOAL: {}'.format(goal))
if not speech_detector.start_speech(goal.vocabulary):
sas.set_succeeded(RequestSpeechResult(word=''))
@@ -33,6 +40,7 @@ def request_speech(goal):
class SpeechDetectorModule(ALModule):
"""ALModule for accessing the NAO's speech recognition."""
def __init__(self, name):
ALModule.__init__(self, name)
@@ -52,6 +60,7 @@ class SpeechDetectorModule(ALModule):
return almem.getData('ALSpeechRecognition/Status')
def start_speech(self, voc, resume=False):
"""Start recognizing the given vocabulary."""
if self._busy != resume:
return False
if not resume:
@@ -63,14 +72,17 @@ class SpeechDetectorModule(ALModule):
return True
def have_word(self):
"""Check if something was recognized."""
return self.recognized is not None
def get_recognized_and_erase(self):
"""Retrieve the recognized word and erase it from the variable."""
result = self.recognized
self.recognized = None
return result
def stop_speech(self, pause=False):
"""Stop detecting or just pause detection."""
if not self._busy:
return
self.asr.pause(True)
@@ -79,6 +91,7 @@ class SpeechDetectorModule(ALModule):
self._busy = False
def on_word_recognized(self, *_args):
"""Callback for word recognized event."""
word, conf = almem.getData('WordRecognized')
if conf > 0.4:
self.stop_speech(pause=True)


@@ -1,4 +1,10 @@
#! /usr/bin/env python
"""Module for walking in the environment (Human Joystick).
ROS Node: `walker`.
Uses `inform_masterloop` service.
"""
from __future__ import division
import os
@@ -23,6 +29,7 @@ VMAX = 1.0
def n_way(a, b, n=3):
"""A point which is (b - a)/n away from a."""
return a + (b - a) / n
@@ -45,6 +52,7 @@ _inform_masterloop = inform_masterloop_factory('walker')
def _speed(pos, interval):
"""Calculate speed based on the operator's position."""
int_dir = 1 if interval[1] > interval[0] else -1
if int_dir * (pos - interval[0]) < 0:
return 0.0
@@ -81,6 +89,7 @@ if __name__ == '__main__':
ref_vec = trans[:2] + rot[2:]
#-1 1 -1 1 -1 1
for i, dr in enumerate((BK, FW, RT, LT, RR, LR)):
# figure out in which direction to move
idx = i // 2
sign = 1 if (i % 2) else -1
speed = _speed(ref_vec[idx], dr)
@@ -94,16 +103,17 @@ if __name__ == '__main__':
if not any(movement):
rospy.logdebug('WALKER: STOP')
_inform_masterloop('stop')  # release the control
# mp.move(0, 0, 0)
mp.stopMove()
continue
permission = _inform_masterloop('move')  # try to seize the control
if not permission:
mp.stopMove()
else:
# set the head so that the scene is clearly visible on cameras
mp.setAngles(('HeadYaw', 'HeadPitch'), (0.0, 0.35), 0.3)
mp.moveToward(*movement)  # DON'T DELETE THIS ONE
pass


@@ -12,6 +12,17 @@
#include <aruco/aruco.h>
#include <teleoperation/utils.hpp>
/*
* This is the ROS Node for detecting the ArUco markers.
*
* Node: aruco_detector
*
* It subscribes to a webcam, detects the markers, transforms them
* into the correct coordinate frame (z-up) and publishes on TF
* relatively to the odom frame
*
*/
static const std::string ARUCO_WINDOW = "Aruco window";
class ImageConverter {
@@ -41,6 +52,7 @@ class ImageConverter {
}
void imageCb(const sensor_msgs::ImageConstPtr& msg) {
// This function gets called when an image arrives
cv_bridge::CvImageConstPtr cv_ptr;
try
{
@@ -54,7 +66,7 @@ class ImageConverter {
return;
}
// Define parameters of markers and camera
cv::Mat cam_mat = (cv::Mat_<double>(3, 3) <<
690., 0.0, 320,
0.0, 690., 225,
@@ -70,7 +82,7 @@ class ImageConverter {
float m_size = 0.15;
detector.detect(cv_ptr->image, markers, cam, m_size);
// Draw all aruco into the window and publish on tf
cv::Mat aruco_demo = cv_ptr->image.clone();
for (int i = 0; i < markers.size(); i++) {
@@ -79,6 +91,8 @@ class ImageConverter {
cv::Mat rot_cv;
cv::Rodrigues(markers[i].Rvec, rot_cv);
// Transform the coordinates of the aruco markers so that they
// are in the Z-up, X-forward frame
tf::Matrix3x3 rot(
rot_cv.at<float>(0, 0),
rot_cv.at<float>(0, 1),
@@ -106,7 +120,7 @@ class ImageConverter {
0, -1, 0
);
rot = hmat * rot * another; // This transforms the coords
trans = hmat * trans;
tf::Transform aruco_tf(rot, trans);
@@ -115,6 +129,7 @@ class ImageConverter {
id += stuff_to_str(markers[i].id);
id += "_frame";
// Publish the transform
this->aruco_pub.sendTransform(tf::StampedTransform(
aruco_tf, ros::Time::now(), "odom",
id.c_str()));