Robotics: Today & Tomorrow - Page 5 - SkyscraperCity
 

Old July 5th, 2013, 06:41 PM   #81
Atmosphere
Live from the sky!
 
 
Join Date: Mar 2009
Location: Amsterdam / Seoul
Posts: 3,049
Likes (Received): 1042



We still need to solve the power supply issue. Currently all those humanoid bots need a cord for power.

But maybe for indoor applications like factories, it could be done with induction via the floor. At least then the robots would be 'free'. (I can't see 20 robots running around with extension cords trailing behind them.)
__________________
Build it
Old July 22nd, 2013, 02:03 PM   #82
Ulpia-Serdica
Registered User
 
Join Date: Oct 2011
Posts: 8,324
Likes (Received): 7661

Self-assembling multi-copter demonstrates networked flight control



Researchers at ETH Zurich have demonstrated an amazing capability for small robots to self-assemble and take to the air as a multi-rotor helicopter. Maximilian Kriegleder and Raymond Oung worked with Professor Raffaello D’Andrea at his research lab to develop the small hexagonal pods that assemble into flying rafts.

The true accomplishment of this research is that no single robot is in control – each unit decides for itself what actions to take to keep the group in the air, in what's known as the Distributed Flight Array.
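The decentralized idea can be sketched in a few lines: each pod repeatedly averages its own estimate with its neighbours', and the whole group converges on a shared value with no central controller. This is only an illustrative consensus toy under my own assumptions (ring topology, made-up gain), not the ETH Zurich code:

```python
# Illustrative sketch (not the ETH Zurich implementation): each pod in
# the array runs the same local update, nudging its altitude estimate
# toward the average of its immediate neighbours'. Repeated local
# averaging drives all pods to a common value -- the basic mechanism
# behind distributed consensus control.

def consensus_step(estimates, neighbours, gain=0.5):
    """One synchronous consensus update over all pods."""
    new = []
    for i, est in enumerate(estimates):
        if neighbours[i]:
            avg = sum(estimates[j] for j in neighbours[i]) / len(neighbours[i])
            new.append(est + gain * (avg - est))
        else:
            new.append(est)
    return new

# Four pods docked in a ring, starting with disagreeing estimates.
estimates = [1.0, 2.0, 3.0, 4.0]
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}

for _ in range(50):
    estimates = consensus_step(estimates, ring)

# All estimates converge toward the initial mean (2.5).
```

Because each pod only talks to its neighbours, the same update works no matter how many pods dock together, which is what makes the approach scale.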







SOURCE: http://www.gizmag.com/distributed-fl...icopter/28380/
__________________

NanoMini liked this post
Old July 22nd, 2013, 08:38 PM   #83
NanoMini
Wat's price of the earth?
 
 
Join Date: Apr 2013
Posts: 2,157
Likes (Received): 2534

Wow, how intelligent they are!
Old July 22nd, 2013, 10:56 PM   #84
Ulpia-Serdica
Registered User
 
Join Date: Oct 2011
Posts: 8,324
Likes (Received): 7661

The LearningGripper from Festo looks like an abstract form of the human hand. The four fingers of the gripper are driven by twelve pneumatic bellows actuators with low-level pressurisation. Thanks to the machine learning process, the gripper is able to teach itself to carry out complex actions such as the targeted gripping and positioning of an object.
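The "teaches itself" part can be illustrated with a minimal trial-and-error loop: try small random changes to the actuator pressures, keep whatever improves a grasp-quality score. This is a toy sketch under stated assumptions (the score function and all values are invented stand-ins), not Festo's actual learning algorithm:

```python
# Toy sketch of self-teaching by trial and error (not Festo's method):
# the gripper perturbs its bellows pressures randomly and keeps any
# change that improves a grasp-quality score. The score here is a
# hypothetical stand-in with a known optimum, so the loop's progress
# can be checked.

import random

def grasp_score(pressures, target=(0.6, 0.4, 0.7, 0.5)):
    """Hypothetical stand-in: grasp is best when pressures hit target."""
    return -sum((p - t) ** 2 for p, t in zip(pressures, target))

def learn(trials=2000, step=0.05, seed=1):
    random.seed(seed)
    best = [0.5, 0.5, 0.5, 0.5]          # start with equal pressures
    best_score = grasp_score(best)
    for _ in range(trials):
        cand = [min(1.0, max(0.0, p + random.uniform(-step, step)))
                for p in best]
        s = grasp_score(cand)
        if s > best_score:               # keep only improvements
            best, best_score = s and cand, s
    return best, best_score

pressures, score = learn()
```

A real gripper would score grasps by actually lifting and placing the object, but the keep-what-works loop is the same shape.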



http://www.festo.com/cms/en_corp/13156.htm
Old July 23rd, 2013, 12:48 AM   #85
Atmosphere
Live from the sky!
 
 
Join Date: Mar 2009
Location: Amsterdam / Seoul
Posts: 3,049
Likes (Received): 1042

Quote:
Originally Posted by Ulpia-Serdica View Post
Self-assembling multi-copter demonstrates networked flight control



SOURCE: http://www.gizmag.com/distributed-fl...icopter/28380/
I wonder how big the group can get. Is it possible to connect a few hundred of those things together and still be able to fly?
__________________
Build it
Old August 1st, 2013, 11:51 PM   #86
Ulpia-Serdica
Registered User
 
Join Date: Oct 2011
Posts: 8,324
Likes (Received): 7661

IIT's HyQ quadruped robot gets better reflexes



The HyQ hydraulically actuated quadruped robot can walk, trot, kick, and jump, but its reflexes need an upgrade before it can move from flat ground to more challenging terrain. To that end, researchers from the Italian Institute of Technology (IIT) have developed an animal-like step reflex algorithm that quickly detects when the robot's feet run into obstacles, preventing trips and falls.



Legged robots like IIT's HyQ were designed to go where other types of robots cannot, but they're not much use if they trip over small obstacles like fallen logs or concrete steps. Robots such as the LS3 from Boston Dynamics typically use a sensor head equipped with LIDAR and cameras to detect major obstacles ahead of time to plan the safest route.



HyQ's sensor head is still in the works, so for now it essentially feels its way forward. HyQ's legs are compliant, allowing the robot to detect and absorb external forces acting on its legs. The step reflex is triggered if the robot detects that the foot motion is obstructed, and causes it to automatically lift that leg over and above the obstruction.



The robot is capable of running at speeds up to 2 meters per second (4.4 mph), so this has to happen within a fraction of a second for it to work. Even then, it doesn't always catch something, similar to when a person steps on something unexpectedly. In that case, it initiates the step reflex the next time it moves its foot. It's good enough to bypass obstacles up to 11 cm (4.3 in) high.
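The trigger logic described above can be sketched very simply: during the swing phase, if the foot's actual travel lags the commanded travel by more than a threshold, assume a collision and command a higher lift. All names and thresholds below are illustrative assumptions, not IIT's implementation:

```python
# Minimal sketch of a step-reflex trigger of the kind described in the
# article (names and numbers are assumptions, not IIT's code). If the
# swinging foot's actual forward travel falls short of the commanded
# travel by more than a threshold, the leg is assumed to have struck
# an obstacle and a higher "lift over" trajectory is commanded.

OBSTRUCTION_THRESHOLD = 0.03   # metres of lag treated as a collision
NORMAL_LIFT = 0.05             # metres, regular step height
REFLEX_LIFT = 0.12             # metres, enough to clear ~11 cm obstacles

def step_reflex(commanded_travel, actual_travel):
    """Return the lift height to command for the rest of the swing."""
    lag = commanded_travel - actual_travel
    if lag > OBSTRUCTION_THRESHOLD:
        return REFLEX_LIFT     # foot blocked: lift the leg over it
    return NORMAL_LIFT         # unobstructed: keep the normal trajectory
```

An unobstructed swing (actual travel close to commanded) keeps the normal lift; a foot stopped well short of its command returns the higher reflex lift, which matches the "lift it the next time it moves its foot" fallback the article mentions.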



Currently the research team at IIT's Dynamic Legged Systems Lab is working on HyQ's vision system and dynamic gaits. The lab is also toying with the idea of adding a pair of arms to the robot, in a centaur-like configuration, which would allow it to interact with objects in its environment. In the meantime, they'd like to begin testing its reflexes in a wooded area later this year.



They're not alone, having recently sold one of the robots to ETH Zurich's Agile and Dexterous Robotics Lab. With similar ambitions, the two labs are working together to accelerate the development of legged robots. And today IIT researchers will be displaying the robot publicly at London's Natural History Museum as part of the Living Machines conference.

SOURCE: http://www.gizmag.com/iit-hyq-quadru...eflexes/28545/
Old September 22nd, 2013, 03:31 PM   #87
Ulpia-Serdica
Registered User
 
Join Date: Oct 2011
Posts: 8,324
Likes (Received): 7661

Robot barman knows when you want a drink



Meet James. He’s a barman with a cheery disposition, is quick with your order, and doesn't tolerate queue jumping. He’s also a one-armed robot with a tablet for a head. But the really curious thing about James is that he can read your body language to find out whether or not you want to order a drink.



The Joint Action in Multimodal Embodied Systems (James) robot is an EU-funded project that started in 2011. As part of the project, Professor Dr. Jan de Ruiter of the Psycholinguistics Research Group at Germany's Bielefeld University along with partners Foundation for Research and Technology-Hellas in Crete, Fortiss in Munich, and the University of Edinburgh set out to solve the problem of how to employ robots as bartenders in a manner that humans would readily accept.

There have been any number of robot bartenders built in recent years. Many have some cool moves, but ordering drinks from one often involves a bit of a learning curve as the patron figures out how to place an order using a touchscreen or smartphone. Unfortunately, pub-goers tend to be a bit single-minded about getting their hands on a pint and don't like complications.

The problem of robot bartenders is simple: Robots don't like the real world. They like things to be tidy, orderly, and predictable – preferably with optical codes printed on everything. However, a pub is about as real as the world gets. It's crowded, noisy, dimly lit, with music and conversation everywhere.

It's relatively easy to make a robot that can mix drinks. It's another matter how to tell the robot what you want to drink. And it's another order of magnitude for the robot to figure out whether or not you want a drink in the first place, and another again to get it to do so in a pub.

Patrons don't like dealing with touch screens or other interfaces. What they want is a robot that really can replace a bartender, so that the drink-ordering process doesn't change as they swap over. The trouble is, bar staff are very good at cutting through all the confusion and finding out who wants to order a drink and who doesn't. What is more remarkable is that they do so using cues that neither they nor the patrons are consciously aware of.

Bielefeld University’s contribution to the James project was to study how people order drinks and program that knowledge into the robot. "Currently, we are working on the robot’s ability to recognize when a customer is bidding for its attention," says De Ruiter. "Thus, we have studied the process of ordering a drink in real life."

For James to be successful, he has to be able to serve people who have never met him and know nothing about how he works. That puts all the pressure on James to get things right. "In order to respond appropriately to its customers the robot must be able to recognize human social behavior," says de Ruiter.

It turns out that it's more important for the robot to understand body language than just what's spoken to it. This was discovered when the team took video cameras to pubs and clubs in Bielefeld and Herford in Germany, and Edinburgh in Scotland, and recorded people ordering drinks at the bar. Later, the videos and snapshots from them were shown to experiment participants, who had to sort them according to which showed someone ordering a drink and which showed someone who wasn't.

The results were rather surprising. When questioned, people said that when they wanted to order a drink they looked at their wallet, held up bank notes, or waved. It turned out that most people actually did none of these things, or did them only rarely. For example, only 1 in 25 waved. Instead, 90 percent stood quietly perpendicular to the bar and looked at the bartender. If they didn't want to order a drink, they adopted a different stance, such as turning slightly away from the bar or chatting with the person next to them.

"Effectively, the customers identify themselves as ordering and non-ordering people through their behavior," says psychologist Dr. Sebastian Loth. When asked in a BBC interview how people learned this ordering behavior, de Ruiter said that it was entirely natural and "like learning to breathe."

The team established that James can determine a patron's posture, movements and actions almost in real time. The next step was to reprogram James to take the new data into account. He had to be programmed not to offend patrons by either mistakenly asking them if they wanted a drink or ignoring someone who wanted to order. The latter, the team says, is worse. This meant giving James clear definitions of when someone is or isn't ordering, and the ability to apply those definitions based on the social context.
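The study's main finding reduces to a surprisingly simple rule, which can be sketched as a toy classifier. This is illustrative only (the angle tolerance and function names are my own assumptions, not the James project's classifier):

```python
# Toy rule-based version of the study's finding (illustrative only):
# roughly 90 percent of ordering customers stood squarely facing the
# bar and looked at the bartender, while waving or holding out money
# turned out to be rare cues.

def wants_to_order(body_angle_deg, looking_at_bartender):
    """body_angle_deg: angle between the patron's body and the bar
    normal (0 degrees = standing square on to the bar)."""
    squarely_at_bar = abs(body_angle_deg) < 20.0   # assumed tolerance
    return squarely_at_bar and looking_at_bartender
```

So someone square to the bar with eyes on the bartender counts as ordering, while someone turned away chatting with a friend does not, and neither wallet nor wave enters into it.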

The James project continues until January. Whether or not the team will be able to program James to discuss the football match last night remains to be seen.

SOURCE: http://www.gizmag.com/james-robot-bartender/29118/
Old September 26th, 2013, 01:19 AM   #88
Сталин
BANNED
 
Join Date: Dec 2011
Posts: 1,776
Likes (Received): 1124

QF-16 fighters without pilots. They are practically robots, apart from the two pilots monitoring them from the ground. This is a step forward for a robotic military.



Old September 28th, 2013, 03:20 AM   #89
Ulpia-Serdica
Registered User
 
Join Date: Oct 2011
Posts: 8,324
Likes (Received): 7661

HUMAN ROBOT GETTING CLOSER



What if a robot could act and even feel like a human? Researchers are getting a bit closer to accomplishing that feat. They're working on a robot that feels, sees, thinks and learns like us as the lines between man and machine become a bit more blurry.



A robot that feels, sees and, in particular, thinks and learns like us: it still seems like science fiction, but if UT researcher Frank van der Velde has his way, it won't stay that way. In his work he wants to implement the cognitive processes of the human brain in robots. The research should lead to the arrival of the latest version of the iCub robot in Twente. This humanoid robot blurs the boundaries between robot and human.

Decades of scientific research into cognitive psychology and the brain have given us knowledge about language, memory, motor skills and perception. We can now use that knowledge in robots, but Frank van der Velde’s research goes even further. “The application of cognition in technical systems should also mean that the robot learns from its experiences and the actions it performs. A simple example: a robot that spills too much when pouring a cup of coffee can then learn how it should be done.”

Possible first iCub in the Netherlands

The arrival of the iCub robot at the University of Twente should mark the next step in this research. Van der Velde submitted an application together with fellow UT researchers Stefano Stramigioli, Vanessa Evers, Dirk Heylen and Richard van Wezel, all active in robotics and cognitive research. At the moment, twenty European laboratories have an iCub, which was developed in Italy (thanks to a European FP7 grant to the IIT). The Netherlands is still missing from the list. Moreover, a newer version is currently being developed, with haptic sensors, for example. In February it will be announced whether the robotics group will actually bring the latest iCub to the UT. The robot costs a quarter of a million euros, and NWO (Netherlands Organisation for Scientific Research) will reimburse 75% of the costs. TNO (Netherlands Organisation for Applied Scientific Research) and the universities of Groningen, Nijmegen, Delft and Eindhoven will then also be able to make use of it. Within the UT, the iCub can be deployed in different laboratories thanks to a special transport system.

Robot guide dog

“The possibilities are endless,” realises Van der Velde. “The new iCub has a skin and fingers that have a much better sense of touch and can feel force. That makes interaction with humans much more natural. We want to ensure that this robot continues to learn and understands how people function. This research ensures, for example, that robots actually gather knowledge by focusing on certain objects or persons. In areas of application like healthcare and nursing, such robots can play an important role. A good example would be that in ten years’ time you see a blind person walking with a robot guide dog.”

Nano-neural circuits

A recent line of research that fits this profile is the development of electronic circuits that resemble a web of neurons in the human brain. Contacts have already been made to start this research in Twente. In the iCub robot, this can be used, for example, for the robot's visual perception, which requires a lot of relatively simple operations that must all be performed in parallel. This takes a lot of time and energy in current systems. With electronic circuits in the form of a web of nerve cells, it is much easier.

“These connections are only possible at the nanoscale, that is to say, at a scale where the material is only a few atoms thick. In combination with the iCub robot, it can be investigated how the robot's experiences are recorded in such materials and how the robot is controlled by nano-neural circuitry. The bottleneck of existing technical systems is often energy consumption and size. The limits of Moore's Law, the proposition that the number of transistors in a circuit doubles every two years through technological advances, are being reached. In this area, too, we are therefore on the verge of many new applications.”

Frank van der Velde

Frank van der Velde has the Technical Cognition chair within the Department of Cognitive Psychology and Ergonomics of the Faculty of Behavioural Sciences. He is affiliated with the research institute CTIT and participates in the European ConCreTe (Concept Creation Technology) project. In the middle of this month, he delivered his inaugural lecture for the position of professor at the University of Twente.

Van der Velde has long had a fascination for cognition and technical systems. He refers to robots as ‘he’ or ‘she’. The example of a robot guide dog comes from his experience with blindness in his personal circle of acquaintances. He is not afraid that robots will eventually dominate humanity. “It will never go that far. The pouring of coffee that I was talking about; let's first make sure that it no longer makes a mess.”

SOURCE: http://www.utwente.nl/en/archive/201...g-closer.docx/
Old October 3rd, 2013, 01:25 AM   #90
vonbingen
BANNED
 
Join Date: Feb 2013
Posts: 3,966
Likes (Received): 2595


World's Top 3 Humanoid Robots - ASIMO vs HRP-4 vs NAO!

No. 1: Honda ASIMO. JAPAN.

The astronaut-lookalike robot ASIMO by Honda is probably the most famous humanoid robot to date. It is the first robot ever to walk, move and even climb stairs like humans. ASIMO is 4 ft 3 in tall and weighs 53 kilograms. One could call it the foundation for the future generation of robotics. It took over 15 years of extensive research to get it into this shape.

No. 2: Kawada HRP-4. JAPAN.

A slim, fast and more advanced robot by the Japanese.

No. 3: Aldebaran Nao. FRANCE.

One of the cutest and most intelligent robots, the Aldebaran Nao can behave on its own and can always be programmed to do more.




Aldebaran Robotics Nao Robot Show in France Pavilion Shanghai Expo 2010.

Here's a sneak peek at the soon-to-be-famous dancing Nao robots, the technological mascot of the France Pavilion.
The robots are "rehearsing" before their first public appearance on June 21, France Pavilion Day, which coincides with Music Day in France.
The performance showcases Nao's range of smooth yet agile and rhythmic movements, set to a three-part music compilation including the famous orchestral masterpiece Boléro by French composer Maurice Ravel.
This also marks the first time robots have supported an artistic performance evoking emotions.
Nao is a humanoid, autonomous, interactive and completely programmable robot created by Aldebaran Robotics (www.aldebaran-robotics.com/en), the worldwide leader in humanoid robotics.
__________________

Сталин liked this post
Old October 5th, 2013, 12:55 AM   #91
Ulpia-Serdica
Registered User
 
Join Date: Oct 2011
Posts: 8,324
Likes (Received): 7661



The ROSA™ robotic device has been designed to increase the safety and reliability of various neurological procedures without compromising established surgical protocols. ROSA™ is an integrated multi-application platform that offers a reliable and accurate surgical assistant.



Comparable to a "GPS" for the brain, the robot can be used for any type of cranial procedure requiring surgical planning with preoperative data, patient registration, and precise positioning and handling of instruments.

To date, ROSA™ is the only robotic assistant approved for neurosurgical procedures in clinical use in Europe, the United States and Canada. It has the following specifications:

A robotic arm with six degrees of freedom



ROSA™ has a robotic arm with six degrees of freedom whose architecture replicates the movements of a human arm. This provides very high dexterity for complex surgical procedures, as well as complete freedom in the choice of trajectories.

Advanced haptic capability

ROSA™ has an advanced haptic capability that gives the neurosurgeon the ability to easily guide the instruments manually, within limits and restrictions established during the planning stage. The surgeon can easily interact with the robot without changing any of his surgical techniques, thus fully reaping the benefits of robotic movement.

A non-invasive and touch free registration system



ROSA™ is equipped with a patented registration system that uniquely combines precise robotic movement with non-invasive laser measurement for patient registration. This method allows the surgery to be done without the use of invasive markers or a frame.

ROSA™'s inherent flexibility empowers the surgeon in a broad range of indications, including biopsies, implantation of electrodes for functional procedures (stimulation of the cerebral cortex, deep brain stimulation), open-skull surgery using navigation, ventricular and transnasal endoscopy, and other keyhole procedures.

SOURCE: http://medtech.fr/en/rosa1
Old October 5th, 2013, 01:01 AM   #92
Guajiro1
Cereal Killer
 
 
Join Date: Dec 2012
Location: Buenos Aires
Posts: 2,580
Likes (Received): 6803

Quote:
Originally Posted by Ulpia-Serdica View Post
IIT's HyQ quadruped robot gets better reflexes



That's now, but in the future...

Old October 5th, 2013, 01:12 AM   #93
Atmosphere
Live from the sky!
 
 
Join Date: Mar 2009
Location: Amsterdam / Seoul
Posts: 3,049
Likes (Received): 1042

Incredible amount of updates from Boston Dynamics robots today

Wildcat


Atlas update


BigDog Update:
__________________
Build it
Old October 13th, 2013, 02:00 PM   #94
Ulpia-Serdica
Registered User
 
Join Date: Oct 2011
Posts: 8,324
Likes (Received): 7661

Back in June, I posted this article about DFKI's iStruct. It seems that this robot has gone through a bipedal transformation.

iStruct robot ape evolves upright as robo-spine research eyes space



What took humans millions of years to achieve, Germany’s robotic ape has done in moments, with a research project into artificial spines showing how a quadrupedal ‘bot can evolve into a bipedal one. Three months ago, DFKI’s iStruct robo-ape was wandering around on all-fours as engineers experimented with human-inspired skeletal systems for potential space exploration. Now, it’s able to stand up and move around on two legs.



As DFKI sees it, traditional robots struggle with moving around non-linear environments because they have feet with a single point of contact and very little movement within their bodies. Instead, the iStruct prototype has multi-point-contact feet, which pair almost fifty sensors – including pressure, angle, distance, and even temperature – with multiple motors for several degrees of freedom, allowing the ape to adjust its stance according to the nature of the surface it’s standing on.

However, the robot also swaps the solid upper body found in something like Honda’s ASIMO for an adjustable one, mimicking the human spinal column. Now articulated with six degrees of freedom, the spine can not only shift the upper body for better balance, but function as a torque sensor too for real-time feedback.

The result, DFKI has found, is a robot that can not only lumber around on all-fours with a high degree of stability, but stand upright and balance more readily even when the upper arms are extended or moved. iStruct can shift its center of mass based on live data about the surrounding environment.
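The balance idea behind the articulated spine can be sketched as a small feedback loop: compute the overall centre of mass from the body segments, and if it drifts away from the centre of the support area, pitch the upper body back the other way. The segment values and gain here are illustrative assumptions, not DFKI's model:

```python
# Sketch of spine-based balancing (values are illustrative, not
# DFKI's): the robot computes its centre of mass (CoM) from its body
# segments; if the CoM drifts from the centre of the support area,
# the spine shifts the upper body in the opposite direction.

def centre_of_mass(segments):
    """segments: list of (mass_kg, x_position_m). Returns CoM x."""
    total = sum(m for m, _ in segments)
    return sum(m * x for m, x in segments) / total

def spine_correction(com_x, support_centre_x, gain=1.0):
    """Command an upper-body shift opposite to the CoM error."""
    return -gain * (com_x - support_centre_x)

# Extended arms shift the CoM ahead of the feet...
segments = [(10.0, 0.00),   # legs, over the support centre
            (8.0, 0.02),    # torso, slightly forward
            (2.0, 0.30)]    # extended arms, well forward
com = centre_of_mass(segments)
# ...so the spine commands a rearward upper-body shift.
shift = spine_correction(com, 0.0)
```

The torque sensing the article mentions would close this loop with measured rather than modelled forces, but the corrective geometry is the same.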

It’s impressive stuff, though we’ll admit to some misgivings about the robo-ape’s hands: they may look like paws when it’s on all-fours, but they start to look a whole lot more like culling hooks when it’s standing up. Exactly when iStruct might start exploring distant environments is unclear at this stage, though the team at DFKI will be showing it off at the Intelligent Robots and Systems conference in Tokyo in early November.

SOURCE: http://www.slashgear.com/istruct-rob...pace-11301045/
Old October 23rd, 2013, 01:07 AM   #95
Ulpia-Serdica
Registered User
 
Join Date: Oct 2011
Posts: 8,324
Likes (Received): 7661

Poppy, a 3D-printed humanoid robot that defies conventions



A new 3D-printed robot called Poppy is helping a team of French researchers study bipedal walking and human-robot interaction. They were able to design, fabricate, and assemble a relatively large robot for around €8,000 (US$11,000) including servo motors and electronics. That's about a third the cost of commercial robots in the same size category like the RQ-TITAN, and is still cheaper than smaller humanoids like the Aldebaran Robotics NAO. And best of all, they plan to make their design open source.



One of the ways they managed to keep the cost down is by using lightweight materials, which means the robot requires less powerful (and cheaper) servo motors. Standing 84 cm (33 in) tall, Poppy weighs in at just 3.5 kg (7.7 lb). The Sony QRIO, by comparison, was 26 cm (10 in) shorter yet it weighed twice as much. Still, most of the cost lies in the robot's 25 servo motors: it utilizes 21 high-end Robotis Dynamixel MX-28s, two MX-64s, and two AX-12s. It's powered by a Raspberry Pi, and is equipped with 16 force-sensing resistors, two HD cameras, a stereo microphone, and an inertial measurement unit. Poppy's "face" is an LCD screen which can be used to show emotions (or to debug).



Making a more human robot

One of the main reasons the INRIA Flowers (FLOWing Epigenetic Robots and Systems) team opted to build its own robot is that none of the available commercial kits are truly biologically inspired. By rapidly prototyping its own robot, the team could use Poppy to challenge some of the usual robot design conventions.



To begin with, it has an articulated spine with five motors – almost unheard of in robots of this size, but one of the ongoing topics at INRIA Flowers since its first humanoid, ACROBAN, from several years ago. The spine not only allows Poppy to move more naturally, but helps to balance the robot by adjusting its posture. The added flexibility also helps when physically interacting with the robot, such as guiding it by its hands, which is currently required to help the robot walk.



Looking at the knees, you'll see springs spanning the upper and lower leg joints. The tension in the springs helps keep the supporting leg straight during each step without motorization. Farther down, its feet are smaller than those of most robots Poppy's size, and its toes are thin, allowing them to bend. Rather than planting each foot parallel to the ground, which is how most robots this size walk, the toes help the robot achieve "heel-to-toe" walking. And as for those children's shoes Poppy wears? They're equipped with five pressure sensors on each sole, which provide useful data.



One of the more obvious deviations in design can be seen in its upper legs, which bend inwards at an angle of six degrees. Despite the fact that this more closely models the human femur, most humanoid robot designs have opted for straight leg linkages. By bending the thighs in, the distance between the two feet is shortened, moving the supporting leg's foot closer to the robot's center of gravity. And that makes it more stable when standing on one leg and when walking.
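A back-of-envelope check shows why even a small bend matters. Angling each thigh inwards by θ moves that foot inwards by roughly L·sin θ, where L is the hip-to-foot length; the figure of ~40 cm below is my own assumption for an 84 cm robot, not a number from the INRIA team:

```python
# Back-of-envelope sketch of the bent-thigh geometry (leg length is an
# assumed figure, not from the Poppy design docs): bending each thigh
# inwards by theta moves each foot inwards by about L * sin(theta),
# narrowing the stance and bringing the support foot closer to the
# vertical line through the centre of gravity.

import math

def stance_narrowing(leg_length_m, bend_deg):
    """How much closer together the two feet end up, in metres."""
    per_leg = leg_length_m * math.sin(math.radians(bend_deg))
    return 2 * per_leg

# Assume ~40 cm from hip to foot for an 84 cm tall robot.
narrowing = stance_narrowing(0.40, 6.0)
# ~0.084 m total: each foot moves inwards by roughly 4.2 cm.
```

A few centimetres per foot is a large fraction of a small robot's stance width, which is consistent with the reduced sway the team measured.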



The INRIA team produced two versions of Poppy: one with straight thighs and another with the bent ones. Experiments showed that the robot with bent thighs swayed far less during its walking gait, making it much more stable. However, the robot still can't balance on its own, so for now it needs a human trainer.

In the future, the team hopes to get Poppy walking on its own, and plans to share its designs with other labs to promote more biologically-inspired humanoid robot designs.

SOURCE: http://www.gizmag.com/poppy-3d-print...bot-kit/29497/
__________________

Сталин liked this post
Old October 23rd, 2013, 01:20 AM   #96
Сталин
BANNED
 
Join Date: Dec 2011
Posts: 1,776
Likes (Received): 1124

It reminds me of an early version of the robots from Blade Runner.
Old November 3rd, 2013, 01:31 AM   #97
Ulpia-Serdica
Registered User
 
Join Date: Oct 2011
Posts: 8,324
Likes (Received): 7661

An insect-like, crash-happy flying robot



Gimball likes to make contact. In fact, this small ultralight flying spheroid resembles an insect as it goes around bumping into things. The goal of EPFL researchers was to develop a machine that could operate in extremely chaotic environments without the need for fragile sensors.

SOURCE: http://www.redorbit.com/news/video/s...obot-10302013/
Old November 6th, 2013, 04:22 PM   #98
Hebrewtext
Registered User
 
 
Join Date: Aug 2004
Location: Tel Aviv - Yafo
Posts: 3,659
Likes (Received): 3766

__________________
Tel Aviv metropolitan area :

230+ towers built & U.C

129+ towers approved

100 towers proposed
Old November 11th, 2013, 11:29 PM   #99
Ulpia-Serdica
Registered User
 
Join Date: Oct 2011
Posts: 8,324
Likes (Received): 7661

ABB Robotics - Introducing the IRB 6700: 7th Generation Large Robots



The new-generation IRB 6700 family are the highest-performance robots in the 150-300 kg class, and are designed to provide a lifetime of affordability and reliability. They have a 20 percent lower total cost of ownership thanks to a more robust design, longer service intervals and simplified maintenance.
Old November 19th, 2013, 12:01 AM   #100
Ulpia-Serdica
Registered User
 
Join Date: Oct 2011
Posts: 8,324
Likes (Received): 7661

Meet REEM-C



Unlike its earlier bipeds, which were never mass produced or sold, PAL Robotics' third generation robot is available to purchase for use as a research platform. It follows a similar trajectory as KAIST's HUBO robots, which are now being sold to universities around the globe for hundreds of thousands of dollars apiece. According to PAL Robotics, REEM-C is ideal for exploring all sorts of robotics problems including walking, grasping, navigation, whole body control, human-robot interaction, and more.



REEM-C stands 165 cm (5 ft 4 in) tall and weighs 70 kg (154 lb), and has a total of 44 degrees of freedom. It walks at a maximum speed of just 28 cm/sec (about 1 ft per second), which is on the slow side, but because it's brand new, that is likely to improve with further development.

It has a modular design, which gives some flexibility to its parts, and comes equipped with two internal computers as well as stereo cameras, microphones, speakers, and other sensors (including laser range finders on its feet) as standard. Powered by a 48 V lithium-ion battery, it can operate for up to 3 hours with frequent movement or 6 hours on standby, and takes about 5 hours to fully charge (slightly better than its competition).

It's difficult to judge its value against similar humanoid research platforms without knowing how much it costs, and unfortunately PAL Robotics has not gotten back to me about that. However, I would expect it to be somewhere in the range of AIST's HRP-4 (US$300k), the working counterpart to this android, and KAIST's HUBO 2 ($400k). Obviously it's not something your average Joe can afford, but prestigious universities (and royalty) may find it ticks all the boxes.

SOURCE: http://www.gizmag.com/pal-robotics-r...d-robot/29812/