Tuesday, 6 September 2016

Raspberry Pi Webcam Robot – Best Video Streaming Tutorial


 

 

Raspberry pi webcam robot DIY Hacking
Raspberry pi webcam robot
It is always cool to add a camera to your remote controlled car or robot, so that you can see exactly where it's heading and use a WiFi-enabled smartphone or tablet to view the video as well. It's very simple to do this using a Raspberry Pi and a USB webcam, and real-time streaming can be done with minimal delay too. Here, steps are given on how to view the feed from your webcam on the monitor connected to the Raspberry Pi as well as on another device in the same local network. This tutorial will show you how you can add a camera and video broadcasting system and make a Raspberry Pi webcam robot. Have fun with this DIY Hacking tutorial!
USB webcam
What do you need for this project? Hardware:
  1. Raspberry Pi model B with memory card preloaded with an OS.
  2. WiFi dongle : Edimax EW 7811UN.
  3. A USB webcam.
Software (programming languages and OS involved):
  1. HTML.
  2. Linux/Raspbian.
How does it work? The working of the Raspberry Pi webcam robot is as follows. The USB webcam connected to the Raspberry Pi usually registers with the Pi at /dev/video0. A streaming service called mjpg-streamer, once installed, is then used to broadcast the video on the Raspberry Pi's local IP address and port number. The resolution and frame rate of the video can be set via command-line options when running mjpg-streamer. The video feed can then be viewed on any device on the local network by opening an HTML file in a browser that loads the video from the Raspberry Pi's IPaddress:portno. The file required for this is provided in this tutorial as well. Also, in order to test the webcam locally on the Pi, a tool called ffmpeg is used. This allows you to view the webcam feed on the monitor connected to the Raspberry Pi. A block diagram showing the basic working:
Block diagram for the raspberry pi webcam robot

Step 1: Connecting the USB webcam and checking it

First of all, you need to check whether your webcam is detected by the Raspberry Pi and whether its feed can be viewed. For this, first run the command "lsusb" in the terminal of the Raspberry Pi. This shows you a list of all the USB devices connected to the Pi. Check whether your webcam's name or driver is displayed in the list (sometimes the driver name is shown, e.g. Microdia for iBall webcams). Next, we need to check whether you can view the feed from the webcam on the Pi. For this, use "cd /dev" to go to the /dev directory. Then use "ls" to list its contents and check whether "video0" is present.
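If you'd rather script this check than type it each time, the same test can be done in a few lines of Python run on the Pi itself (the /dev/video* naming follows the usual V4L2 convention; the helper name is mine):

```python
import os

def find_video_devices(dev_dir="/dev"):
    """Return a sorted list of V4L2 device nodes such as /dev/video0."""
    return sorted(
        os.path.join(dev_dir, name)
        for name in os.listdir(dev_dir)
        if name.startswith("video")
    )

if __name__ == "__main__":
    devices = find_video_devices()
    if devices:
        print("Webcam device(s) found: " + ", ".join(devices))
    else:
        print("No /dev/video* nodes found - check the cable and the lsusb output")
```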
Raspberry pi with USB webcam and wifi dongle
Now, you have to use a tool called ffmpeg to view the feed from the webcam on the monitor connected to the Pi. Use "sudo apt-get update" to update the package lists. To install ffmpeg, use these commands:

sudo apt-get install ffmpeg
ffplay -f video4linux2 -framerate 15 -video_size 320x240 /dev/video0

The first command installs ffmpeg. The second command starts up the video with a frame rate of 15 fps using the video4linux2 input, a resolution of 320×240, and the device at /dev/video0. Once you run it, you will see the webcam feed on the monitor.

Step 2: Setting up the video streaming service for the webcam

Here, the video from the Raspberry Pi is broadcast on the local network, so any device on this network can view the video from the webcam. To do this, you need to first install mjpg-streamer. Use these commands to install its dependencies:

sudo apt-get install libv4l-dev
sudo apt-get install libjpeg8-dev
sudo apt-get install subversion
sudo apt-get install imagemagick

libv4l-dev and libjpeg8-dev are build dependencies, imagemagick is used for image handling, and subversion is used to check out the source code. To download and compile the code, use these commands:

svn co https://svn.code.sf.net/p/mjpg-streamer/code/
cd /home/pi/code/mjpg-streamer/
make USE_LIBV4L2=true clean all
sudo make DESTDIR=/usr install

Next, to run this service and begin streaming, use the following command:

mjpg_streamer -i "/usr/lib/input_uvc.so -d /dev/video0 -y -r 640x480 -f 10" -o "/usr/lib/output_http.so -p 8090 -w /var/www/mjpg_streamer"

  • -i input plugin parameters
  • -d represents the video device
  • -y enables YUV format, disables MJPEG
  • -r specifies the resolution
  • -f is the frame rate
  • -o output plugin parameters
  • -p specifies the port number
  • -w specifies output webserving directory
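Besides the live stream, mjpg-streamer's HTTP plugin can also serve single frames via ?action=snapshot. As a quick sanity check from another machine on the network, a short Python sketch like this can grab one frame (the host IP and port 8090 match the command above; the helper names are mine):

```python
from urllib.request import urlopen

def mjpg_url(host, port=8090, action="stream"):
    """Build the URL served by mjpg-streamer's output_http plugin."""
    return "http://{}:{}/?action={}".format(host, port, action)

def save_snapshot(host, path="snapshot.jpg", port=8090):
    """Grab a single JPEG frame from the running stream and save it."""
    with urlopen(mjpg_url(host, port, "snapshot")) as response:
        data = response.read()
    with open(path, "wb") as f:
        f.write(data)
    return path
```

Opening mjpg_url("192.168.1.9") in a browser shows the live stream, while save_snapshot("192.168.1.9") writes a single still image to disk.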
Now your Raspberry Pi has started broadcasting the video. To view it, download this HTML file: video.html. Its contents are shown below (the image element in the page will be replaced by your video stream), but you need to edit the file before using it:

<html>
<body>
<h1>DIY Hacking - Webcam Robot</h1>
<img src="http://192.168.1.9:8090/?action=stream" width="600">
</body>
</html>

Here, replace the IP address "192.168.1.9" with the IP address of your Raspberry Pi. You can find it by running the command "ifconfig" on your Pi. Finally, open this HTML file in your browser to view the video. On devices like an iPad, use any file manager app to open the HTML file.
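Editing the file by hand works fine; if you prefer, the page can also be generated with a short script. A minimal Python sketch (the IP address, port and filename are just the examples used above):

```python
# Hypothetical helper that writes video.html for a given Raspberry Pi address.
TEMPLATE = """<html>
<body>
<h1>DIY Hacking - Webcam Robot</h1>
<img src="http://{ip}:{port}/?action=stream" width="600">
</body>
</html>
"""

def make_viewer_page(ip, port=8090):
    """Return the video.html contents pointing at the given Raspberry Pi."""
    return TEMPLATE.format(ip=ip, port=port)

if __name__ == "__main__":
    with open("video.html", "w") as f:
        f.write(make_viewer_page("192.168.1.9"))
```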

Step 3: Building the Raspberry Pi Webcam Robot

If you wish to build your own robot, then continue reading. Follow the instructions at Smartphone controlled robot to build a smartphone controlled robot, and then come back to this tutorial. This is the robot from that tutorial:
Smartphone controlled Robot
Now you need to add the Raspberry Pi and the webcam to this, and use a WiFi dongle to make it wireless. I used one half of a plastic box to create a platform for the Raspberry Pi and webcam; use hot glue to stick them together. And that's it, you have made a video streaming, smartphone controlled robot. You can use SSH to start the program remotely on the Raspberry Pi without needing to hook it up to a monitor.
Assembling the raspberry pi webcam robot
The robot can be maneuvered using a smartphone application on an Android phone. The corresponding feed from the Raspberry Pi's USB webcam can then be streamed to an iPad, PC, or any other device connected to the local WiFi network.
Parts of the raspberry pi webcam robot
Use this DIY Hacking tutorial to make surveillance systems, video streaming robots, or even drones. Real-time streaming can be done with minimal delay using mjpg-streamer, which is considered one of the best streaming services. Please note that an Arduino is not required alongside the Pi for building this robot; you can build it using just the Raspberry Pi.

Monday, 5 September 2016

Build a WiFi Controlled Robot with the ESP8266










Building your own mobile robot is becoming easier and easier, thanks to excellent ready-to-use robotic platforms. A good example of such platform is the Adafruit Mini Robot Chassis kit, which comes with a nice robot chassis, two servo motors with wheels and a support-wheel. This makes it the perfect base for all your mobile robot projects.
On the other hand, you now can buy powerful & cheap microcontrollers like the ESP8266 WiFi chip, which is not only easy to use but also comes with onboard WiFi connectivity. That’s a chip I used a lot on this website to build home automation projects, but it is also the perfect chip to control robots remotely from your computer or mobile device. In this project, we are going to build a WiFi controlled mobile robot based on the ESP8266. At the end of this tutorial, you will be able to control your own mobile robot from your computer, right from your favourite web browser. Let’s dive in!

Hardware & Software Requirements

We are first going to see what components are needed to build this project. First, you will of course need a robot platform. I used the Adafruit mini robot rover chassis as the platform, as it’s really convenient to use along with the ESP8266 WiFi chip. It also comes with servomotors and wheels, so you don’t have to buy those components separately.
Then, to actually control the robot, I used the Adafruit ESP8266 feather board, along with the Adafruit Motor FeatherWing board to control the motors. To connect both boards, I used the Adafruit FeatherWing doubler board.
Finally, to power the robot, I used two different power sources. I used a 3.7V LiPo battery to power the ESP8266 WiFi board, and a 4 x AA battery pack with 1.2V batteries to power the motors of the robot.
This is the list of all the components you will need for this project:
For this project, you will need the latest version of the Arduino IDE, along with the following libraries:
  • aREST
  • Adafruit_MotorShield

Hardware Configuration

Let’s now see how to assemble the robot. The Mini Robot Rover Chassis comes as a kit, so for the basic assembly I will refer you to this excellent guide to assemble the main parts of the robot:
https://learn.adafruit.com/bluefruit-feather-robot/wiring-and-assembly
You can basically follow this guide till you obtain a result like on the following picture:
hw1
So far, you should have the motors and the wheels assembled on the robot, as well as the 3.7V LiPo battery & the FeatherWing doubler mounted on the robot. We are going to use the battery to power the ESP8266 WiFi chip and the Motor FeatherWing, and we’ll use the FeatherWing Doubler to mount all the feather boards on the robot without using a lot of vertical space.
Now, we are going to take care of the motors of the robot. These will be powered by a larger power supply that can drive the motors faster. First place four batteries (1.2 V NiMH or 1.5 V alkaline AA) into the 4 x AA battery holder, and then connect the battery holder to the Motor FeatherWing just as on the picture:
hw2
You can now place this FeatherWing on the FeatherWing doubler, as well as the ESP8266 Feather HUZZAH board. Also connect the two DC motors to the Motor FeatherWing. This should be the result at this stage:
hw3
Finally, mount the second stage of the robot using the metallic spacers, and also mount the battery holder on top of the robot:
hw4
Note that on the last picture, I already connected the 3.7V LiPo battery to the ESP8266 feather board. However, you can wait until the robot is fully configured to connect the battery.
Congratulations, you just assembled your mobile robot based on the ESP8266 WiFi chip! In the next section, we are going to learn how to configure it so it can receive commands via WiFi.

Configuring the Robot

We now need to configure the ESP8266 WiFi chip on our robot so it can receive commands via WiFi. For that, we are going to use the aREST framework, which is a very convenient way to make the ESP8266 completely controllable via WiFi.
Let's now have a look at the code for this project. It starts by including the required libraries:
#include "ESP8266WiFi.h"
#include <aREST.h>
#include <Wire.h>
#include <Adafruit_MotorShield.h>
After that, we create an instance of the Adafruit_MotorShield, and also create instances for the two motors:
Adafruit_MotorShield AFMS = Adafruit_MotorShield(); 
Adafruit_DCMotor *L_MOTOR = AFMS.getMotor(4);
Adafruit_DCMotor *R_MOTOR = AFMS.getMotor(3);
We also create an instance of the aREST library:
aREST rest = aREST();
You will need to enter your WiFi network name & password inside the sketch:
const char* ssid = "your-wifi-name";
const char* password = "your-wifi-password";
We also declare a set of functions to control the robot:
int stop(String message);
int forward(String message);
int right(String message);
int left(String message);
int backward(String message);
We also expose all the control functions to the aREST API, so we can call them via WiFi:
rest.function("stop", stop);
rest.function("forward", forward);
rest.function("left", left);
rest.function("right", right);
rest.function("backward", backward);
Inside the loop() function of the sketch, we process incoming connections with aREST:
WiFiClient client = server.available();
if (!client) {
  return;
}
while(!client.available()){
  delay(1);
}
rest.handle(client);
Let’s now have a look at the implementation of the functions used to control the robot. For example, this is the function used to make the robot move forward:
int forward(String command) {

  // Set both motors to full speed, moving forward
  L_MOTOR->setSpeed(200);
  L_MOTOR->run( FORWARD );

  R_MOTOR->setSpeed(200);
  R_MOTOR->run( FORWARD );

  return 1;
}
Note that you can find the complete code inside the GitHub repository of the project:
https://github.com/openhomeautomation/esp8266-robot
It’s now finally time to configure the robot! First, grab all the code from the GitHub repository of the project, and modify it with your own WiFi name and password. Then, upload the code to the ESP8266 feather board. Once that’s done, open the Serial monitor, you should see the IP address of the board. Make sure you copy and paste this IP somewhere, you’ll need it in the next section where we’ll configure the interface to control the robot.

Controlling the Robot via WiFi

We now have a robot that can accept commands via WiFi, but we definitely don't want to have to type commands inside a web browser: it would be way too slow to control a robot! That's why we are now going to create a nice graphical interface to control the robot.
This interface will be based on aREST.js, a JavaScript library which is very convenient to control aREST-based projects. It will be automatically imported by the interface we are going to create in a moment, so you don’t need to worry about it. If you want to learn more about aREST.js, you can visit the GitHub repository of the library:
https://github.com/marcoschwartz/aREST.js
Let's first have a look at the HTML file of the interface. Inside the <head> tag, we import all the required files for the interface:
<script type="text/javascript" src="https://cdn.rawgit.com/marcoschwartz/aREST.js/master/aREST.js"></script>
<script type="text/javascript" src="script.js"></script>
Now, inside the same file, we define a button for each of the commands of the robot, for example to make the robot go forward:
<div class='row'>

  <div class="col-md-5"></div>
  <div class="col-md-2">
    <button id='forward' class='btn btn-primary btn-block' type="button">Forward</button>
  </div>
  <div class="col-md-5"></div>

</div>
We still need to link the buttons inside the interface to the actual commands of the robot. This will be done in a file called script.js, which will make the link between the graphical interface & the robot.
The file starts by defining the IP address of the robot, and by creating an instance of an aREST device:
var address = "192.168.0.103";
var device = new Device(address);
Then, for each of the buttons of the interface, we call the corresponding function on the robot. Also, as we want the buttons to behave like push buttons (meaning whenever the button is released, the robot stops), we also need to call the stop function when the button is released:
$('#forward').mousedown(function() {
  device.callFunction("forward");
});
$('#forward').mouseup(function() {
  device.callFunction("stop");
});
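Because aREST simply maps each exposed function to an HTTP GET endpoint, the browser interface is not the only option: any HTTP client works. Here is a minimal Python sketch of the same push-button behaviour (the IP address is the example one from the sketch above; the helper names are mine):

```python
from urllib.request import urlopen

ROBOT_IP = "192.168.0.103"  # replace with your ESP8266's actual IP address

def function_url(ip, name):
    """aREST exposes each registered function at http://<ip>/<name>."""
    return "http://{}/{}".format(ip, name)

def call_function(name, ip=ROBOT_IP):
    """Invoke one of the exposed functions: forward, backward, left, right, stop."""
    with urlopen(function_url(ip, name), timeout=5) as response:
        return response.read().decode()

if __name__ == "__main__":
    import time
    call_function("forward")   # start moving...
    time.sleep(0.5)            # ...for half a second...
    call_function("stop")      # ...then stop, like releasing the button
```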
It’s now time to finally test our interface and make our robot move around! For that, make sure to edit the script.js file inside the interface folder, and put the actual IP address of your ESP8266 WiFi chip. If that’s not done yet, also disconnect the ESP8266 feather board from USB, and power the ESP8266 using the 3.7V LiPo battery.
Then, open the interface with your favorite web browser. This is what you should see:
(Screenshot of the control interface)
As you can see, there is a button for each function of the robot. You can now try it: whenever you press a button (and keep it pressed), your robot should move immediately!
This is an example of my own mobile robot moving around while I was using the interface:
(Animation of the robot moving around)
Congratulations, you built your own mobile robot based on the ESP8266 and controlled it via WiFi! Note that you can also control the robot using a mobile device, like a smartphone or tablet, with the exact same interface.

How to Go Further

In this article, we learned how to build a mobile robot based on the ESP8266 WiFi chip, and on the Adafruit Mini Robot Rover Chassis Kit. We first assembled the robot, and configured it so it accepts commands via WiFi. We then controlled the robot using a graphical interface running in your web browser.
There are of course many ways to now improve the project, based on what you learned in this article. You could for example add a distance sensor in front of the robot, and have the measured distance displayed inside the same interface. You could also couple a gyroscope to the robot, and have more complex functions like making the robot turn from a given angle. The possibilities offered by this excellent chassis & the ESP8266 WiFi chip are endless, so don’t hesitate to experiment and have fun!

Thursday, 1 September 2016

I2U2: I am the world's simplest telepresence robot.

Who am I




 I2U2 Robot
                                                     


I am a telepresence robot & your companion. I can help you –
a. communicate seamlessly with friends and family through voice and video
b. feel as if you are physically present around them
  
I have complete freedom of movement, which includes moving and looking around. I can be controlled remotely from your mobile or PC via the internet.




Use me to

Be virtually present at home when you need to connect with your children or parents. Never miss an event or celebration just because you are stuck at a meeting or had to travel
Monitor your home efficiently by moving and looking around entire rooms; avoid blind spots that fixed cameras may have
Why just video conference when you can Interact, play, move around, look around? Find your own unique use case. The possibilities are endless…

How do I work

How I2U2 Works



My features


Features




How do I look

Robot details


The system consists of two parts:
  • Controller App: remotely controls the Robot
    • Mobile app for Android
    • iOS & PC support coming soon
  • Robot App: acts as the “head” for the Robot hardware
    • Runs on the Android tablet connected to the robot
    • iOS support coming soon



  • Requirements and specifications

    Devices + Apps required

    1. Tablet (Robot device) + I2U2 Robot App
    2. I2U2 Robot
    3. Laptop/Tablet/Smartphone (Controller device) + I2U2 Controller App

    Hardware Specifications

    (Recommended for best experience)

    Robot device – Tablet on Robot

    1. Platform: Android (iOS support from October 2016)
    2. RAM: 2 GB
    3. Screen size: 7 inches or larger
    4. Front Camera: 2 MP
    5. Quad Core processor
    6. Bluetooth support

    Controller device – Smartphone/Tablet

    1. Platform: Android (iOS support from March 2016)
    2. RAM: 2 GB
    3. Screen size: 7 inches or larger
    4. Front Camera: 2 MP
    5. Quad Core processor

    Controller device – Laptop (support from October 2016)

    1. Operating system: Windows, Linux, iOS
    2. Browser: Chrome, Firefox
    3. Ram: 4GB
    4. Front Camera: 2 MP
    5. Quad Core processor

    Other specifications

    Internet bandwidth: 4 Mbps and above (for video calls)
    Color: a combination of white + black
    Robot-to-tablet connectivity: Bluetooth
    Tablet-to-remote-app connectivity: WiFi with internet access (a blue heart on the Robot glows when the devices are connected)
    Dimensions – width: 35 cm
    Dimensions – height: 62 cm (floor to top of the head)
    Obstacle clearance: 3 cm (clearance for carpets and small floor items)
    Frame material: powder-coated metal (rust- and bend-proof)
    Body casing material: polypropylene (designer bin)
    Head movement (vertical): 90 degrees up and down (120-degree field of view)
    Wheel movement: 360 degrees (can move forward, back, turn left, turn right)
    Power: 220 V/110 V (EU/US/universal)
    Battery backup: 8–12 hours (depends on usage)
    Speakers: 8 ohm (keep connected to the tablet)
    Weight: 6 kg

    How to setup the Robot & App?

    Robot setup:

    1. Unpack the Robot from the box.
    2. Plug in the charging cable to the Robot. Charge it for at least 8 hours OR until all the lights in the battery indicator glow.
    3. Press the button to switch on the Robot. On successfully switching on, the battery indicator glows continuously, and the mood light glows “Blue” for a few seconds.
    Switch_large

    Connect the robot to the tablet

    1. Place the ‘Robot device’ in the tablet holder and press the clips until it holds the tablet tight. Once you clip it, try moving/ removing the tablet. It shouldn’t move
    2. Connect the speaker and the micro USB power cable to the tablet. This helps you hear better & charge the tablet even when in use

    App configuration on the tablet

    1. Download & install the I2U2 Robot app from the Google Play Store
    2. Log in to the app with your Gmail account
    3. Set a unique name for the Robot
    4. Add the Gmail address of the users you would like to give access to the robot
    5. Ask these users to follow the three steps mentioned below

    App configuration on the ‘Remote device’

    1. For laptop users: in the Chrome browser, open i2u2robot.in and log in using your Gmail account
    2. For smartphone/tablet users: download the I2U2 Remote App from the Google Play Store and log in to the app using your Gmail account
    3. On your homepage, you should see the Robots that you have remote access to

    Connecting the tablet to the Robot

    1. Place the tablet in the tablet holder and press the clips until it holds the tablet tightly in place
    2. Connect the speaker and the power cable to the tablet
    Hurray! The Robot is ready

    Calling the Robot from Remote app

    1. From the Remote app, call the Robot by clicking on the ‘call’ button beside the name of the Robot
     

SITA’s baggage robot lends passengers a hand at Geneva Airport

Location: Geneva, Switzerland
17 May 2016
Leo the robot speeds up bag drop by collecting passengers’ baggage before they enter the terminal
Passengers arriving at Geneva Airport in the past few days have received help with their bag drop from Leo, an innovative baggage robot developed by air transport IT provider SITA, which is being trialed outside the airport’s Terminal 1.
Leo baggage robot at Geneva Airport
Leo is a fully autonomous, self-propelling baggage robot that has the capacity to check in, print bag tags and transport up to two suitcases with a maximum weight of 32 kg. It also has obstacle avoidance capability and can navigate in a high-traffic environment such as an airport.
Leo provides a glimpse into the future of baggage handling being explored by SITA Lab and is the first step to automating the baggage process from the moment passengers drop their bags to when they collect them. Using robotics and artificial intelligence, bags will be collected, checked in, transported and loaded onto the correct flight without ever having to enter the terminal building or be directly handled by anyone other than the passengers themselves. 
Leo – named after the Italian Renaissance inventor and engineer Leonardo da Vinci who built what is now recognized as the world’s first robot – comes to the assistance of passengers as they approach the terminal building. Touching Leo’s Scan&Fly bag drop interface opens the baggage compartment doors to allow passengers to place their bags inside. After the passengers have scanned their boarding passes, the tags are printed and can be attached to the bag. With the bags loaded and tagged, the compartment door closes and Leo displays the boarding gate and departure time.
Leo then takes the bags directly to the baggage handling area where they are sorted and connected to the correct flight. The doors of the robot can only be reopened by the operator unloading the baggage in the airport.
The use of robots such as Leo means that in future fewer bags and trolleys will enter the airport terminal, reducing congestion and making airport navigation easier.
Dave Bakker, President Europe at SITA, said: “Through the innovative work of the SITA Lab we are able to tackle some of the key challenges that face airports today. Leo demonstrates that robotics hold the key to more effective, secure and smarter baggage handling and is a major step towards further automating bag handling in airports. Leo also provides some insight into the potential use of robots across the passenger journey in future.”
Massimo Gentile, Head of IT at Genève Aéroport, said: “In a busy airport such as Geneva Airport, the use of a robot such as Leo limits the number of bags in the airport terminal, helping us accommodate a growing number of passengers without compromising the airport experience inside the terminal. Leo also proves the case for increased use of robotics to make passengers’ journey a little more comfortable, whether it is checking in baggage, providing directions or helping them through the security process.”
Leo, which was built for SITA by BlueBotics, is part of SITA’s showcase of technology at the 2016 Air Transport IT Summit taking place from 24-26 May in Barcelona.
To watch Leo in action at Geneva Airport click on the following link: www.sita.aero/baggagerobot

Bengaluru international airport to deploy robotic assistants for check-in, security

Unveiling of NAO Evolution: a stronger robot and a more comprehensive operating system

ALDEBARAN is announcing the launch of NAO EVOLUTION, the new generation of its NAO robot, equipped with the NAOqi 2.0™ operating system.    
Aldebaran, the global leader in humanoid robotics, is pleased to announce the launch of NAO EVOLUTION, the 5th and latest generation of NAO, the interactive, autonomous, and fully programmable robot. NAO is already being used for specific research and education purposes. Over 5000 robots are currently operating in 70 countries. With its new functionality, NAO EVOLUTION is the next big step for the development of innovative applications for a broad range of companies and content publishers.
Please click on this link to go to the Multimedia News Release: http://www.multivu.com/mnr/71400559-lancement-de-nao-evolution
NAO EVOLUTION has the most up-to-date operating system, NAOqi 2.0, now being used by all of the company's robots, including the emotional robot Pepper. Designed for easy, natural voice interaction, the NAOqi environment includes a dialogue engine, an emotional engine, and Autonomous Life, the system that gives NAO autonomous behavior. All robots designed by Aldebaran share the same technologies and operating system, enabling the transfer of applications from one robot to another with a minimal amount of adaptation. In this sense, Aldebaran has succeeded in the challenge of creating a unique "platform" available in several humanoid forms and benefiting from the same advanced software.
NAO EVOLUTION also has improved functionality that makes interaction between humans and the robot easier and allows developers to program complex sequences themselves:
·         Improved shape and facial detection and recognition using new algorithms
·         Improved sound source location using 4 directional microphones
·         Refined obstacle detection and distance estimation with a detection range from 1 cm (0.39 inches) to 3 m (9.8 feet) using new sonar telemeters
·         More powerful battery: 48.6 watt-hours, giving NAO EVOLUTION about 1 h 30 min of battery life in operational mode, i.e. 30% more than the previous generation.
Although NAO EVOLUTION is the same size as before (height: 58 cm, 1.9 feet), it is now:
·         Stronger, with metal gears in the neck, hips, legs, and ankles.
·         Quieter, with soles that dampen the noise and friction of its footsteps.
·         More skillful, grasping objects more easily using finger grips.
 
"This new phase is fully in line with our vision of eventually offering the greatest possible number of humanoid robots for a variety of purposes" says Bruno Maisonnier, founder and CEO of Aldebaran. "Apart from schools and universities, we would like companies and the developer community to get hold of our robots and create applications that will ensure the success of NAO in the future."
Now available, NAO EVOLUTION will be marketed in packages by region.
These packages include a full range of software and services and 1 to 3 years of after-sales servicing and warranties.
Aldebaran, the names and logos of NAO, Pepper and NAOqi are designated trademarks of Aldebaran in France and in other countries.
 
About Aldebaran
Established in 2005 by Bruno Maisonnier and with offices in France, China, Japan, and the United States, Aldebaran designs, manufactures and sells autonomous humanoid robots to contribute to human well-being. More than 5,000 NAOs, our first product, are used today as a research and education platform in 70 countries around the world. Aldebaran has 500 employees involved in robot development and manufacture. Aldebaran is a SoftBank Group company; SoftBank holds 78.5% of its capital.
 
 
                           
Robot
Bengaluru’s Kempegowda International Airport will deploy around 10 NAO humanoid robots developed by Aldebaran Robotics.
Humanoid robots developed by Aldebaran Robotics, a French robotics company headquartered in Paris and owned by Japanese conglomerate Softbank, will soon be deployed at the international airport in Bengaluru to assist airport staff.
Bengaluru International Airport Ltd (BIAL), the public-private consortium that operates the Kempegowda International Airport (KIA), has not made an official announcement about this, and a spokesperson said “it was too premature to talk about.”
However, sources at KIA confirmed that the robots are being tested, and around 10 of them will be used at various points inside the international airport to process check-ins and assist Central Industrial Security Force (CISF) personnel in scanning passengers' baggage and boarding passes.
The robots belong to the NAO family of autonomous, programmable humanoid robots developed by Aldebaran, which was acquired by the Softbank Group in 2013 for $100 million. The company has released five versions of its line of programmable, voice-, speech- and facial-recognizing, small-scale humanoid robots since its launch in 2006.
NAO, Aldebaran’s first humanoid robot, is 58 cm (around 2 ft) in height and is currently in its 5th iteration. The company has sold almost 10,000 NAOs throughout the world, in areas such as education, entertainment, hospitality, travel and sports. Each NAO robot is said to cost around Rs 10 lakh in India.
At Tokyo’s Narita International Airport, a NAO robotic clerk that speaks multiple languages was first deployed at the store of Mitsubishi Tokyo UFJ, a major Japanese retail bank, but has now moved to other areas of the airport to greet flyers who don’t speak Japanese. A fully autonomous, self-propelling baggage robot called Leo at Geneva airport meets flyers outside the terminal, scans their boarding passes and takes luggage away after issuing passengers a printed receipt. The robot is being run by air transport IT provider SITA, and was developed by BlueBotics SA, a Swiss company.
The humanoid robot is equipped with tactile sensors, ultrasonic sensors, a Gyro, an accelerometer, force sensors, infrared sensors, HD cameras, four microphones and high accuracy digital encoders on each joint. It is controlled by a specialised Linux-based operating system, named NAOqi.
NAO is said to be an endearing, interactive and personalizable robot companion. It is highly customisable for various environments and applications. “Everyone can construct his own experience with specific applications based on his own imagination and needs,” says the company’s website. Aldebaran has also developed two other humanoid robots, Pepper and Romeo, since developing and retailing NAO.
In India, NAO has reportedly been used by the Indian Institute of Technology, Kanpur (FactorDaily could not independently verify this).

How To Install Ubuntu Linux Alongside Windows 10 (UEFI)
 

Introduction

Updated For Ubuntu 16.04

Windows 10 has been out for a while now and as I have a track record for writing dual boot guides I thought it was about time I created a Windows 10 and Ubuntu dual boot guide. 

This guide focuses on computers with a Unified Extensible Firmware Interface (UEFI) rather than a standard Basic Input Output System (BIOS). Another guide will follow shortly to cover the BIOS version.

What this means is that if you were using Windows 8 or 8.1 before upgrading to Windows 10 then this guide will work for you. If you have just bought a brand new Windows 10 machine and it has a standard hard drive (i.e. it isn't a Surface Pro) then this guide will also work for you.

If your computer used to run Windows 7 before upgrading to Windows 10 then it is highly probable that this guide isn't suitable, in which case you should follow this guide instead.

How can you tell whether your computer has UEFI rather than a standard BIOS?

In the search box at the bottom of the screen type "System Information" and when the icon appears at the top click on it.

Halfway down the right panel there is an item called BIOS mode. If it says UEFI then this guide will work for you.
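If you are already booted into a Linux live session, the same check can be made from a terminal: on UEFI systems the kernel exposes the directory /sys/firmware/efi. A minimal sketch — the helper name and the optional path parameter are inventions for this example; on a real machine call it with no arguments:

```shell
# Report the firmware type by checking for the sysfs directory that the
# kernel creates only when it was booted in UEFI mode. The optional
# argument exists purely so the helper can be exercised against an
# arbitrary path; normally you would call it with no arguments.
firmware_type() {
  if [ -d "${1:-/sys/firmware/efi}" ]; then
    echo "UEFI"
  else
    echo "BIOS"
  fi
}

firmware_type   # prints UEFI or BIOS depending on the machine
```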

Steps For Dual Booting Windows 10 And Ubuntu

The steps required for dual booting Windows 10 and Ubuntu are as follows:

  • Backup your Windows 10 operating system (optional but highly recommended)
  • Create a Ubuntu USB drive
  • Enable booting from a USB drive
  • Shrink the Windows 10 partition to make space for Ubuntu
  • Boot into Ubuntu live environment and install Ubuntu
  • Amend the boot order to make sure Ubuntu can boot
I have written another guide which shows how to dual boot Ubuntu and Windows 10 on a computer with an SSD. This is largely experimental as it was my first time doing it, but it works for me and it might give you some ideas when partitioning your SSD.

Back Up Windows 10

In the list of steps above I put this down as optional, but I can't stress enough that you really should do it.

Let's imagine for a moment that you have a machine that used to run Windows 8 and you spent the time upgrading to Windows 10.

If you follow this process and for whatever reason it doesn't work and your machine is left in an undesirable state then without a backup the minimum it will cost you is the time it takes to reinstall Windows 8 and then upgrade to Windows 10.

Imagine now that you don't have the Windows 8 media and you don't have a viable recovery partition. You now have no way of getting Windows back without buying either the Windows 8 disk which costs around £90 or a Windows 10 disk which costs £199. You would also have to find and download any required graphics, audio and other drivers required for Windows to run properly.

I have written a guide (linked below) which shows you how to back up all of your partitions using a tool called Macrium Reflect. There is a free version of the tool available, and the most this tutorial will cost you is time and, if you don't already have one, an external hard drive or a spindle of blank DVDs.

Create A Ubuntu USB Drive

There are many tools out there for creating a Ubuntu USB drive including UNetbootin, Universal USB Creator, YUMI, Win32 Disk Imager and Rufus.

Personally the tool that I find most useful for creating Linux USB drives is Win32 Disk Imager. 

I have written a guide showing how to create a Ubuntu USB drive. 


It shows you how to do the following things:
  • How to get Win32 Disk Imager
  • How to install Win32 Disk Imager
  • How to format a USB drive
  • How to create a Ubuntu USB drive
  • How to set the power options in Windows 10 to allow booting from USB
  • How to boot into a Ubuntu live environment
You will obviously need a USB drive for this purpose.

Click here for a guide showing how to create a Ubuntu USB drive.


If you would prefer to, you can buy a USB drive with Ubuntu already installed on it.

If you want to get the USB drive back to normal after installing Ubuntu follow this guide which shows how to fix a USB drive after Linux has been installed on it.
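If you happen to have another Linux machine available, the USB drive can also be written directly with dd rather than Win32 Disk Imager. A sketch under stated assumptions — the helper name, the ISO path and the /dev/sdX device below are illustrative; always verify the target device with lsblk first, because writing to the wrong device destroys its contents:

```shell
# Copy an image file onto a target device block-for-block with dd.
# On a real machine (paths illustrative, run with care):
#   write_image ~/Downloads/ubuntu-16.04-desktop-amd64.iso /dev/sdX
write_image() {
  # conv=fsync flushes all buffered writes before dd exits, so the
  # stick is safe to unplug as soon as the command returns.
  dd if="$1" of="$2" bs=4M conv=fsync 2>/dev/null
}
```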

Shrink Windows To Make Space For Ubuntu

If your computer only has one hard drive you will need to shrink your Windows 10 partition in order to make space for Ubuntu.

Click here for a guide showing how to shrink your Windows 10 partition.

Boot Into Ubuntu Live Environment

Make sure that the Ubuntu USB drive is plugged into the computer.

If you backed up your computer using Macrium and you chose to create the Macrium boot menu option then you can simply reboot your computer. 

When the above screen appears click on the "Change defaults or choose other options" link at the bottom of the screen.

If you chose not to create the Macrium boot menu option boot into Windows, insert the Ubuntu USB drive, hold down the shift key and reboot your computer. (Keep the shift key held down until a screen similar to the one below appears).


Each manufacturer has a different version of UEFI and so the menu options may be different.

The important thing is that a blue screen with white writing appears.

You are basically looking for the option to boot from the USB drive and this may take some finding. From the image above I chose the “Choose other options” menu item which produced the screen below.



I then clicked on the “Use a device” option which as you can see has the subtext “Use a USB drive, network connection or Windows recovery DVD”.


A list of devices will now appear.

This isn’t the first time I have installed things on this computer and my EFI partition still has links to old Ubuntu versions.

The important link on this screen is the “EFI USB Device” option.

Choose the EFI USB Device option and Ubuntu should now boot from the USB drive.

A boot menu will appear.

Choose the first menu option to try Ubuntu.

A large dialogue window will appear with options to install Ubuntu or to Try Ubuntu.

Click on the “Try Ubuntu” option. Ubuntu will now be loaded as a live session. You can try out all of the features of Ubuntu but if you reboot all the changes will be lost.

Install Ubuntu

To start the installation click on the “Install Ubuntu” icon on the desktop.

After clicking on the “Install Ubuntu” option the following screen will appear:

This is the beginning of the installation process and you can select the language which is used to help you through the process.

Choose your language and click “Continue”.

The installer has changed a little bit for Ubuntu 16.04. The pre-requisites screen has been removed as has the option to connect to a wireless network prior to installing.

The preparing to install Ubuntu screen now simply lists the option to download updates (which is only available after you have an internet connection) and the option to install third party software for playing MP3 audio and watching Flash.

If you have a decent internet connection then you might wish to install updates during the installation.

To connect to the internet click on the network icon in the top right corner and a list of wireless networks will appear. Click on the network you wish to connect to and enter the security key when prompted.

You will need to click the back button on the "preparing to install Ubuntu" screen and then click continue again when you are back at the welcome screen.

If you have a poor internet connection then I would choose not to connect to the internet. You can update your system after it has been installed.

You can choose to install the third party tools for playing MP3 audio as part of the installation process now by checking the box or you can do it after the system has been installed.

Click "Continue".

The “Installation Type” screen lets you decide whether you want to install Ubuntu alongside Windows or over the top.

Choose the “Install Ubuntu alongside Windows Boot Manager” option.

Click “Install Now”.


A window will appear showing you what is going to happen to your disk. By default the Ubuntu installer will create an ext4 partition for Ubuntu and all of your personal files, and a swap partition used to page out idle memory when RAM runs low.

Click “Continue”.
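For reference, after the installation completes the two partitions the installer creates appear in /etc/fstab roughly like this (a sketch only — the UUIDs and column values below are illustrative placeholders, not output from a real system):

```
# /etc/fstab (illustrative entries only)
# <device>                                   <mount>  <type>  <options>          <dump> <pass>
UUID=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx   /        ext4    errors=remount-ro  0      1
UUID=yyyyyyyy-yyyy-yyyy-yyyy-yyyyyyyyyyyy   none     swap    sw                 0      0
```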

A map of the world will appear. The purpose of it is to make it possible for Ubuntu to set the time on your clock correctly.

Click where you live on the map or start typing it into the box provided and then click “Continue”.

Almost there. Just two more steps before Ubuntu is installed.

You now need to choose your keyboard layout. Select your keyboard’s language in the left pane and then the actual physical layout in the right pane.

Alternatively click on the detect keyboard layout option and it will more than likely do it for you.

Test out the keyboard layout that you have chosen by typing into the box provided. Specifically try out symbols such as the dollar sign, pound symbol, hash tags, speech marks, slashes and other special characters as these are the keys that tend to move around on a keyboard.

Click “Continue”.

The final step is to create a default user.

Enter your name and give your computer a name.

Enter a username into the box provided and choose a password and repeat it.

Click on the “Require my password to log in” option. I don’t really recommend anyone letting their machine log in automatically unless it is a virtual machine used for test purposes.

Finally click “Continue”.

The files will now be copied to your computer.

When the process has finished you will have the option to continue testing or to restart now.

Choose the “Continue Testing” option.

Change The Boot Order So That Ubuntu Can Boot

You will need to be connected to the internet for this to work.

Click on the network icon in the top right corner and choose your wireless network (unless you are connected with an Ethernet cable). Enter the security key.

Open up a terminal window by either pressing CTRL ALT and T at the same time or click the top icon in the bar on the left side and type “term” into the search box.  Click on the terminal icon that appears.

Type sudo apt-get install efibootmgr into the terminal window.

When asked whether you want to continue press “y”.

After the installation has completed type efibootmgr into the terminal window.

A list of boot devices will appear.

As you can see in my list there are the following boot options:
  • boot0000 for Ubuntu (this is an old version and can be ignored)
  • boot0001 which is Windows
  • boot0002 and boot0003 are two LAN devices
  • boot0004 which is the new version of Ubuntu that I just installed
  • boot0005 is my USB device
  • boot0006 and boot0007 are two other LAN devices
  • boot0008 is another USB device

At the top of the text you will see that my current boot device is boot0005 which is the USB device.

More important is the boot order, which is listed as 0001,0000,0004,2001.

What this tells me is that the computer will boot Windows first, then the rubbish version of Ubuntu, then the new version of Ubuntu and finally a USB device.

This is clearly incorrect.
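If you only want the BootOrder line, you can pipe efibootmgr's output through awk. A sketch using sample text shaped like the listing described above (the entries are illustrative; on a real system you would run `efibootmgr | awk -F': ' '/^BootOrder/ {print $2}'` instead):

```shell
# Sample efibootmgr-style output, abbreviated and illustrative only.
sample='BootCurrent: 0005
BootOrder: 0001,0000,0004,2001
Boot0000* ubuntu
Boot0001* Windows Boot Manager
Boot0004* ubuntu'

# Split each line on ": " and print the value of the BootOrder field.
echo "$sample" | awk -F': ' '/^BootOrder/ {print $2}'   # → 0001,0000,0004,2001
```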

To change the boot order all you have to do is enter the following command:

sudo efibootmgr -o 4,1

The -o option says that I want to change the order. Then all I have to do is list the order I want things to boot in.

So in the command above I have stated that I want Ubuntu to boot first and then Windows.
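The short forms 4 and 1 are expanded by efibootmgr to the zero-padded hexadecimal entry numbers Boot0004 and Boot0001. A small helper illustrating that expansion — the function name is made up for this sketch, and efibootmgr itself does this for you:

```shell
# Expand a short boot order like "4,1" into the zero-padded hexadecimal
# entry numbers the firmware stores, e.g. "0004,0001".
expand_order() {
  echo "$1" | tr ',' '\n' | while read -r n; do
    printf '%04X\n' "$((0x$n))"   # treat each entry as hex, pad to 4 digits
  done | paste -sd, -             # re-join onto one comma-separated line
}

expand_order 4,1   # → 0004,0001
```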

Type exit into the window and reboot the computer by clicking the icon in the top far right corner of the screen.

Choose to shutdown and reboot your computer.

When given the option and before the computer actually reboots remove the USB drive. 

Now when you restart your computer a menu will appear with options for booting into Ubuntu and Windows 10.

Try them both out and hopefully you will have successfully installed Ubuntu alongside Windows 10.

What Next

The Antidote

If you have followed this guide and you want Windows back the way it was before you installed Ubuntu follow this guide:

Troubleshooting

If Ubuntu still will not boot after running EFI Boot Manager try reading this guide which aims to help with UEFI boot issues.