Saturday, May 4, 2013

Testing and Analysis

The tests that were performed during the course of the project development were:
  • Unit testing of each software module developed.
  • Control of the robot: the initial test involved sending commands from a PC to check that the robot behaves as expected. Next, we tested control of the robot from the Android application.
  • Resource utilization tests:
    • Battery usage test against theoretical calculations based on the specifications.
    • Memory usage tests.
  • Ping sensor accuracy test.


The total heap memory used by the Android application is about 2.88 MB, which is about 48% of the total memory available to the app. The memory is allocated mainly to transmit data from the Android application to the robot.


To test the accuracy of the ping sensor, we used an obstacle of dimensions 30 cm × 20 cm. The obstacle was placed at different distances from the ping sensor and the output values were noted. The table below gives the values obtained during the test. It is clear from the observed values that the accuracy of the ping sensor decreases at large distances.

Actual Distance (cm)   Ping Value (cm)
5                      6
10                     11
15                     16
25                     26
36                     35
72                     71
95                     92


Batteries used: two 9 V Duracell batteries, plus a separate 5 V Duracell source that powered the Beaglebone.

Tests were conducted to check the life of the 9V battery sources.
The current ratings of the various peripherals are:

  • Wi-Fi adapter: 500 mA (max)
  • Webcam: 150 mA (operating)
  • Arduino: 40-50 mA (assumed)
  • Motors: 200 mA
  • Ping sensor: 20 mA

Application Screens

In this section, we explain the different screens of the Android app and how to navigate them.


Figure 1 shows the opening screen of the app. This is the first screen loaded when the application is launched. It gives the user two options: connect to the robot by clicking the START button, or quit the application by clicking the QUIT button. When the user presses START, the Android application initializes the communication sockets and establishes a connection with the robot.

Figure 1. Screenshot showing the opening screen of the App


Once the user clicks the START button on the opening screen, the application opens its Main Screen. If the connection to the robot succeeds, the user can start controlling the robot. If the application cannot connect to the robot, a dialog box reports that the connection could not be established, asks the user to try again, and takes the user back to the Opening Screen.

If the application establishes a connection with the robot and the user presses start on the dialog box, the application displays the video stream, with the ping distance at the top of the screen and the speed slider on the right. When the main screen first loads, the slider sits in the middle to indicate that the robot is being driven neither forward nor backward. This is a full-screen Activity whose main view is the video view; the ping distance, the slider, and the PAUSE and SETTINGS buttons have been added on top of the video view using fragment layouts. The two buttons are displayed when the user taps any part of the screen other than the slider.

Figure 2. Screenshot showing the main screen of the App

In Figure 2 we can see that the video view (GstreamerSurfaceView) is the main view of the full-screen activity, with the other components added on top of it. When the user presses the PAUSE button, the application sends a STOP signal to the robot and stops sending/receiving data over the sockets; the accelerometer input is also disabled. A dialog box then gives the user the option to either play again or go back to the opening screen. If the user selects play, the video resumes streaming and the user's inputs are sent to the robot again. If the user presses back, the application returns to the opening screen and immediately closes all connections with the robot.

When the user presses the SETTINGS button, the settings activity is started and the robot is stopped until the user returns. On returning from the settings activity, the user again has the option to either play or go back to the opening screen.


The settings screen lets the user edit the application settings. The first option enables or disables the ping sensor. The second option selects the distance at which the robot has to come to a halt in front of an obstacle; the user can choose from 10 cm, 15 cm, 20 cm, 30 cm, and 40 cm. Once the user makes the selections and presses the back button on the settings screen, the application immediately transmits them to the robot. This activity extends PreferenceActivity and uses SharedPreferences to store the user's options.

Communication over Sockets

The Android application communicates with the robot over TCP and UDP sockets. There are three main sockets on the application side, all clients of servers created on the robot.
The three sockets are used for:
  • TCP socket for the video stream: this socket connects on port 5000 to the TCP server hosting the video stream, which is transmitted over this connection.
  • UDP socket for control data: this socket sends the accelerometer data and the slider input from the application to the robot. UDP was chosen so that there is no delay in transmitting the control signals. The same socket also carries the ping sensor distance readings back to the application, which displays them in the user interface.
  • TCP socket for commands: this third socket sends start and stop signals to the robot. The start signal is encoded as the hex byte 0x02; when the robot receives it, it opens the UDP port, starts listening for control data from the application, and begins transmitting the ping distance. The hex byte 0x01 tells the robot to stop. The table below shows the messages and their corresponding codes:
    Command          Code
    Start            0x02
    Stop             0x01
    Ping enable      0x50
    Ping disable     0x51
    Stop at 10 cm    0x0A
    Stop at 15 cm    0x0F
    Stop at 20 cm    0x14
    Stop at 30 cm    0x1E
    Stop at 40 cm    0x28

Android Application

The Android application has a simple, intuitive user interface. It allows the user to control the robot and watch the video stream on the phone.


The main functionalities of the Android application are:

1. The robot can be controlled using the phone.
2. The video stream from the robot can be watched on the phone.
3. The user can connect to and disconnect from the robot at any time.
4. The application displays the distance to the nearest obstacle.
5. The application allows the user to turn the ping sensors on/off, and to set the distance from an obstacle at which the robot comes to a halt.


The robot's movements are controlled by the user's inputs on the phone. Forward and backward movement is controlled by the slider bar on the Main Screen: pushing the slider forward moves the robot forward, and pulling it back moves the robot backwards. Turning is controlled by the phone's accelerometer: the user tilts the phone right to turn right and left to turn left, with the amount of turn depending on the amount of tilt. The robot holds one wheel at a constant speed and reduces the speed of the other wheel; the reduction depends on the amount of tilt, and at a very high tilt the slowed wheel starts to turn in the opposite direction. Since the robot moves at the same speed as long as the slider stays in the same position, this lets us maintain a constant speed.


The video stream is transmitted by the robot over a TCP connection on port 5000. The Gstreamer SDK is used to receive the stream on the Android application; its native functions initialize, play, and pause the video. The user watches the live stream on the Main Screen. When the user pauses the robot, the video stream also pauses, and it resumes once the user starts controlling the robot again.


The user can connect to and disconnect from the robot at any time. On connecting, the three sockets are opened and connected to the servers on the robot. If the sockets cannot be opened, or the communication channel between the robot and the phone cannot be established, a dialog box reports that the application could not connect. Once connected, the user can disconnect at any point; the application first ensures that the robot is brought to a stop and then closes the socket connections. The user can reconnect at any time.


The application displays the distance to the obstacle at the top of the Main Screen. When the ping sensor has been disabled, the displayed distance is 0 cm.

Disable/Enable Ping Sensor:

The user can enable or disable the ping sensor from the Android application by opening the settings menu and selecting or deselecting the check-box for the ping sensor.

Distance at which the robot has to come to a halt: 

The user can select the distance at which the robot has to come to a halt from five options: 10 cm, 15 cm, 20 cm, 30 cm, and 40 cm. To do so, the user opens the settings menu and selects the desired stopping distance.

Friday, May 3, 2013

Ping Sensor Interfacing

Interfacing the ping sensor with the Beaglebone was a bit more challenging due to the difference in operating voltage levels between the Beaglebone and the ping sensor. To overcome this, an Arduino microcontroller was placed in between to convert the voltage levels from 3.3 V to 5 V and vice versa.

The figure below shows the block diagram for interfacing of the ping sensor.


The ping sensor waits for a short (about 5 µs) trigger pulse on its signal pin. When it receives one, it sends out an ultrasonic burst after a roughly 750 µs hold-off and listens for the echo. The output of the ping sensor is a high pulse whose width depends on the distance to the obstacle.

There are three software components used to interface the ping sensor:
1. An Arduino program that waits for the signal from the Beaglebone and starts the ping operation.
2. A user-space program to send a signal to the Arduino to start the ping operation.
3. A kernel driver on the Beaglebone to handle GPIO interrupts and obtain accurate timing measurements.


The Arduino waits for a high pulse on pin 8. Upon receiving it, the Arduino sends a pulse to the ping sensor, which starts the ping operation. The Arduino then waits until the signal on its ping pin (pin 11) goes low and sends a pulse to the Beaglebone on pin 9.


The program running on the Beaglebone has two parts: a user-space program and an interrupt handler running in the kernel driver. To start the ping operation, the Beaglebone sends a pulse to the Arduino on GPIO1_6. It then performs a read on /dev/gpioInt, the node associated with our custom driver for handling GPIO interrupts. When the read function is called, control transfers to the kernel, where a timestamp is taken and the calling process is put to sleep. When the Beaglebone receives the return pulse from the Arduino, the interrupt handler in the kernel driver runs: another timestamp is taken, the sleeping process is woken up, and the difference between the timestamps is passed back to the user-space program.


The robot has the ability to come to a gradual halt when it detects an obstacle, and the user can set the distance from the obstacle at which it stops. Initially, the ping sensor is polled every 1 second. When an obstacle is detected within 20 cm of the set range, the robot pings faster (every 200 ms). Depending on the distance to the obstacle, the program generates its own value for the forward speed; this is compared with the value obtained from the Android application (the user's slider input), and the minimum of the two sets the robot's forward speed.

Monday, March 25, 2013

Video streaming using Gstreamer

A major part of our project is to stream live video from a camera mounted on the bot to the Android phone. We are using a Microsoft LifeCam VX-5000, connected to the Beaglebone via USB, to capture the video.
Gstreamer is an open source multimedia framework widely used for media streaming. It is well documented and provides an easy-to-use command-line interface. The major hurdle in using Gstreamer was finding the right set of commands to reduce the streaming lag to less than a second.

Gstreamer commands used:

Server (Beaglebone):
 gst-launch v4l2src device=/dev/video0 ! videorate ! video/x-raw-yuv,width=160,height=120,framerate=6/1 ! jpegenc quality=30 ! multipartmux ! tcpserversink port=5000  

Client (PC and Android):
 gst-launch tcpclientsrc host=localhost port=5000 ! multipartdemux ! jpegdec ! autovideosink  

We used a very low resolution (160 × 120) to meet the low-lag requirement. Using these commands, the streaming lag was reduced to less than half a second.

Gstreamer on Android:

We followed the tutorial given here to install Gstreamer on Android. We had to install JNI and the Gstreamer SDK to build Gstreamer projects in Eclipse. After configuring Eclipse, we made a small modification to the tutorial app (changing the string passed to gst_parse_launch to the client command given above) to make it work for our requirements.

Saturday, March 23, 2013

PWM on Beaglebone- Interfacing of Servo Motor

The Parallax Boe-Bot comes with continuous-rotation servo motors. To drive them, we required PWM output from the Beaglebone, which has three PWM modules, each with two outputs. Accessing the PWM modules is very easy; the tutorial found here gave us detailed step-by-step instructions on how to configure them.

The servo motor requires a PWM wave of 50 Hz for proper operation. The speed and direction of rotation are controlled by the high time (pulse width) of the PWM wave:

  1. A pulse width of 1500 µs stops the motor.
  2. A pulse width of 1300 µs makes the motor rotate in one direction.
  3. A pulse width of 1700 µs makes the motor rotate in the other direction.
The speed of rotation varies linearly with the pulse width, i.e. a pulse width of 1400 µs makes the motor rotate at half speed, and so on. The speed remains constant if the pulse width is reduced below 1300 µs or increased above 1700 µs.

We used the two outputs of EHRPWM0 (Enhanced High Resolution PWM) module to control the two servo motors of the bot. The PWM outputs are connected to pins 29 and 31 of header P9 on the Beaglebone. To interface the servo motor:
  1. Connect VCC, Ground and PWM output pin to appropriate leads of the motor
  2. Configure MUX settings, so that pin 29 outputs PWM wave
     echo 1 > /sys/kernel/debug/omap_mux/mcasp0_aclkx  
  3. Request EHRPWM0:0 from the OS
     echo 1 > /sys/class/pwm/ehrpwm.0:0/request  
  4. Set PWM frequency
     echo 50 > /sys/class/pwm/ehrpwm.0:0/period_freq  
  5. Set PWM duty cycle
     echo 1300000 > /sys/class/pwm/ehrpwm.0:0/duty_ns  
  6. Run
     echo 1 > /sys/class/pwm/ehrpwm.0:0/run  
These steps were followed to output a PWM wave (20 ms period, 1300 µs high time) from the terminal. The same was then converted into a C program using file operations.

Thursday, March 21, 2013

GPIO on Beaglebone- Blinking LEDs

The first program that we wrote was the famous "Hello World" of the embedded universe, i.e. blinking an LED connected to a GPIO pin. Access to GPIO pins is provided through the sysfs interface on Angstrom. To blink an LED we must:

  1. Look up the GPIO number of the pin in the Beaglebone reference manual. For example, pin 3 on P8 is connected to GPIO1_6. To obtain the GPIO number, multiply the port number by 32 and add the pin number, so GPIO1_6 is 1×32 + 6 = 38.
  2. Next we must export this pin, so that it is accessible from user space.
     echo 38 > /sys/class/gpio/export  
  3. Set the direction of the pin as output
     echo out > /sys/class/gpio/gpio38/direction  
  4. Write values to the 'value' file
     echo 1 > /sys/class/gpio/gpio38/value  
The above steps were followed to blink an LED from the terminal. The same was then converted into a C program using C file operations (fopen, fprintf).

Wednesday, March 20, 2013

Getting Wi-Fi Connectivity on the Beaglebone

Getting Wi-Fi connectivity on the Beaglebone was a tricky process; it took us some time to find out which packages had to be installed.
We are using the Belkin F5D7050 USB Wi-Fi adapter, connected to the USB port on the Beaglebone, to connect to Wi-Fi networks.
After a lot of searching we found that two packages had to be installed on the Linux distribution:

  1. The firmware required for the Wi-Fi adapter, rt73-firmware, downloaded from the Angstrom distribution webpage.
  2. The wireless-tools package, also downloaded from the Angstrom distribution webpage.
Once we had installed the two packages, we specified the network SSID and passphrase in the /etc/wpa_supplicant.conf file. We then restarted the Beaglebone and ran ifup wlan0 to bring the Wi-Fi connection up.
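For reference, the network entry we added to /etc/wpa_supplicant.conf has roughly this shape (the SSID and passphrase below are placeholders, not our actual network):

```
network={
    ssid="your-network-ssid"
    psk="your-passphrase"
}
```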

Sunday, March 17, 2013

Installation of Ångström OS on the beagle bone

The installation of the Angstrom OS on the Beaglebone was quite a simple process:
  1. Download the latest Angstrom distribution image onto your PC.
  2. Connect the micro SD card to your PC.
  3. Write the image to the micro SD card.
  4. Connect the micro SD card to the Beaglebone and power the Beaglebone.
  5. ssh into the Beaglebone to access its files.
We followed the tutorial on this page.

Friday, March 15, 2013


In this project, we intend to develop a remote-controlled robot with a webcam mounted on it that transmits a live video feed to an Android application on the user's phone. The Android application will be used to control the robot.

Embedded Side:

A Beaglebone will be mounted on a small robot. The output pins of the Beaglebone will be connected to the motors to control the robot. A camera will be mounted on the robot and interfaced with the Beaglebone via USB. Ping sensors attached to the front of the robot will detect any obstacles within range. A Wi-Fi adapter will be used to connect the Beaglebone to the network.

Android Side:

The user will control the robot using an Android application, which will also render the live video feed from the robot. The phone's accelerometer will be used to steer the bot.


Given below is the list of hardware and software components that will be used in the project.

Hardware:
  1. Beaglebone
  2. Belkin F5D7050 Wi-Fi adapter
  3. Parallax Boe-Bot
  4. Webcam
  5. Parallax ping sensors
  6. Android phone

Software:
  1. Angstrom OS for the Beaglebone
  2. Gstreamer for streaming
  3. Eclipse and the ADT tools

Friday, March 1, 2013


Welcome to our blog. We are a team of 3 students at the University of Pennsylvania who have undertaken a project to build an Android-controlled robot as part of the Embedded Systems Programming course.