Tunzelbots

Eugénie von Tunzelmann: Ever since reading Richard Dawkins' book 'The Blind Watchmaker', I'd wanted to try my hand at some evolutionary programming. The idea is to model natural selection inside the computer by generating procedural creatures and allowing them to vary and improve over time without user intervention. The code to build and rig the robots was written in Python, as was the code to run the rigid body simulation, which uses the Open Dynamics Engine to drive the sim. I wrote an importer for Side Effects' Houdini to read in my robot simulations so I could render them out as pictures.
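Below is a minimal sketch in Python of the generate-evaluate-select loop such a project revolves around. It is not von Tunzelmann's code: the genome layout and the stand-in fitness function are invented for illustration, and a real run would score each genome by simulating the resulting robot (e.g. with the Open Dynamics Engine) rather than with a toy formula.

```python
import random

# Toy stand-in for the real fitness test. In a Tunzelbots-style setup this
# would build and rig the robot, run a rigid body simulation, and score how
# far it walked; here we simply reward genomes whose parameters sum to 10.
def fitness(genome):
    return -abs(sum(genome) - 10.0)

def random_genome(n_params=8):
    # A genome is just a vector of joint/limb parameters (invented layout).
    return [random.uniform(-5.0, 5.0) for _ in range(n_params)]

def mutate(genome, rate=0.1, scale=0.5):
    # Each parameter has a small chance of being nudged by Gaussian noise.
    return [g + random.gauss(0.0, scale) if random.random() < rate else g
            for g in genome]

def evolve(pop_size=50, generations=100):
    population = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        # Rank by fitness, keep the best half, refill with mutated copies.
        population.sort(key=fitness, reverse=True)
        survivors = population[:pop_size // 2]
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(pop_size - len(survivors))]
    return max(population, key=fitness)

best = evolve()
print(best, fitness(best))
```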

Fish On Wheels

From Studio diip: “Fish on Wheels” has been developed so fish can steer their tank in a chosen direction. Our pet fish have always been limited to their water-holding area known as “the fish tank”. In an attempt to liberate fish all over the world, the first self-driving car for fish has been developed. This car moves by detecting the fish's position with computer vision. Up until now, driving vehicles has been limited to mankind (excluding a handful of autonomous vehicles driven by computers), but now your pet fish can also put the pedal to the metal. A prototype version of “Fish on Wheels” has been constructed using a standard webcam, a battery-powered BeagleBoard and an Arduino-controlled robot vehicle. Using the contrast of the fish against the bottom of the tank, its position is determined and used to send commands to the Arduino to move the car in that direction.
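A rough sketch of that tracking loop is given below, assuming OpenCV (version 4) for the vision step and a pyserial link to the Arduino. Studio diip's actual code is not shown here, so the threshold value, the serial port name, and the single-letter drive commands are all assumptions made for illustration.

```python
import cv2      # OpenCV 4 assumed (findContours returns two values)
import serial   # pyserial; hypothetical link to the Arduino motor controller

cap = cv2.VideoCapture(0)                      # the overhead webcam
arduino = serial.Serial('/dev/ttyUSB0', 9600)  # port name is an assumption

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # The dark fish stands out against the light tank bottom, so a simple
    # inverted threshold isolates it (threshold value chosen arbitrarily).
    _, mask = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        continue
    fish = max(contours, key=cv2.contourArea)  # largest blob = the fish
    m = cv2.moments(fish)
    if m['m00'] == 0:
        continue
    cx = int(m['m10'] / m['m00'])              # fish centroid, x coordinate
    width = frame.shape[1]
    # Map the fish's horizontal position to a steering command
    # (hypothetical one-byte protocol on the Arduino side).
    if cx < width // 3:
        arduino.write(b'L')      # fish in the left third: turn left
    elif cx > 2 * width // 3:
        arduino.write(b'R')      # fish in the right third: turn right
    else:
        arduino.write(b'F')      # fish centred: drive forward
```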

uArm

From the project's Kickstarter ($104,217 pledged of a $5,000 goal): uArm is a 4-axis parallel-mechanism robot arm, inspired by the ABB PalletPack industrial robot arm IRB460 ($185 for the complete black kit and a gripper). The basic design is Arduino-controlled with 4 degrees of freedom. Three servos on the base control the main movement of the arm, and a mini servo on top moves and rotates the object. The end-effector of the arm is always kept parallel to the ground. We have already developed a Windows application that allows the uArm to be controlled with a keyboard or mouse. With some basic control skills you can use virtually any input device to drive it; for example, we have also controlled the arm with other remote controllers. With our embedded inverse-kinematics algorithm, the uArm can be precisely controlled using coordinates. We have also written an Arduino library specifically for controlling the uArm, so if you are familiar with Arduino, you can program it directly in the Arduino IDE. By calling different functions, you can easily move the uArm to your desired position without doing tons of hard math... cont'd
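Coordinate-based control comes down to inverse kinematics: given a target point, solve for the joint angles that reach it. The uArm's own library is Arduino C++ and the arm itself is a parallel linkage with a rotating base, so the snippet below is only a generic planar two-link IK sketch in Python, with placeholder link lengths, to show the kind of geometry the embedded algorithm hides from the user.

```python
import math

def two_link_ik(x, y, l1=148.0, l2=160.0):
    """Planar two-link inverse kinematics (elbow-down solution).

    Returns (shoulder, elbow) angles in degrees for a target (x, y) in the
    arm's plane. Link lengths are placeholders, not the uArm's real geometry.
    """
    r2 = x * x + y * y
    # Law of cosines gives the elbow angle from the target distance.
    cos_elbow = (r2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if not -1.0 <= cos_elbow <= 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return math.degrees(shoulder), math.degrees(elbow)

# Example: angles needed to reach a point 200 mm out and 100 mm up.
print(two_link_ik(200.0, 100.0))
```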

ROS 101: Intro To The Robot Operating System

Clearpath Robotics has posted parts one and two of its ongoing introductory series on the Robot Operating System (ROS):

Part One: Intro. Since we practically live in the Robot Operating System (ROS), we thought it was time to share some tips on how to get started with ROS. We'll answer questions like: Where do I begin? How do I get started? What terminology should I brush up on? Keep an eye out for this ongoing ROS 101 blog series, which will provide a top-to-bottom view of ROS, introducing basic concepts simply, cleanly and at a reasonable pace... cont'd

Part Two: Setup and Example. In the previous ROS 101 post, we provided a quick introduction to ROS to answer questions like "What is ROS?" and "How do I get started?" Now that you understand the basics, here's how they apply to a practical example. Follow along to see how we actually 'do' all of these things... cont'd
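For a feel of what 'doing' ROS looks like in practice, here is a minimal publisher node in Python of the kind beginner ROS tutorials build up to; the node and topic names are arbitrary, and on a working ROS install it would publish a string message once per second.

```python
#!/usr/bin/env python
# Minimal ROS "talker": publishes a std_msgs/String on a topic at 1 Hz.
# Node and topic names are arbitrary examples.
import rospy
from std_msgs.msg import String

def talker():
    rospy.init_node('ros101_talker')
    pub = rospy.Publisher('chatter', String, queue_size=10)
    rate = rospy.Rate(1)  # 1 Hz
    while not rospy.is_shutdown():
        pub.publish(String(data='hello from ROS 101'))
        rate.sleep()

if __name__ == '__main__':
    try:
        talker()
    except rospy.ROSInterruptException:
        pass
```

With a roscore running, launch the node and inspect its output with `rostopic echo /chatter`.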

faBrickation: Fast 3D Printing Using Bricks

Hasso-Plattner-Institut: faBrickation is a new approach to rapid prototyping of functional objects, such as the body of a head-mounted display. The key idea is to save 3D printing time by automatically substituting sub-volumes with standard building blocks — in our case Lego bricks. When making the body for a head-mounted display, for example, getting the optical path right is paramount. Users thus mark the lens mounts as “high-resolution” to indicate that these should later be 3D printed. faBrickator then 3D prints these parts. It also generates instructions that show users how to create everything else from Lego bricks.
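The substitution step can be pictured as a greedy pass over a voxelized model: cover as much of the unmarked volume as possible with standard bricks and send whatever remains, plus the user-marked high-resolution regions, to the printer. The sketch below is a toy Python illustration of that idea, not HPI's faBrickator algorithm; the grid resolution and single brick size are assumptions.

```python
import numpy as np

def legofy(occupied, high_res, brick=(1, 2, 4)):
    """Greedily substitute filled voxels with one standard brick size.

    occupied, high_res: boolean arrays of shape (z, y, x) on a grid whose
    cell equals one Lego stud. Returns brick placements plus the mask of
    voxels left for 3D printing. A toy sketch of the idea only.
    """
    remaining = occupied & ~high_res          # candidates for substitution
    bricks = []
    bz, by, bx = brick
    zs, ys, xs = occupied.shape
    for z in range(zs - bz + 1):
        for y in range(ys - by + 1):
            for x in range(xs - bx + 1):
                block = remaining[z:z + bz, y:y + by, x:x + bx]
                if block.all():               # a full brick footprint is free
                    bricks.append((z, y, x, brick))
                    remaining[z:z + bz, y:y + by, x:x + bx] = False
    # Anything bricks couldn't cover, plus the high-resolution regions,
    # goes to the 3D printer.
    to_print = remaining | (occupied & high_res)
    return bricks, to_print
```

A real legolizer also has to handle brick orientations, interlocking between layers, and the interface to the printed parts, all of which this sketch ignores.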

Google to Buy Artificial Intelligence Startup DeepMind for $400M

From Re/code: Google is shelling out $400 million to buy a secretive artificial intelligence company called DeepMind. Google confirmed the deal after Re/code inquired about it, but declined to specify a price. Based in London, DeepMind was founded by games prodigy and neuroscientist Demis Hassabis, along with Shane Legg and Mustafa Suleyman... cont'd. See also DeepMind's recently published paper, "Playing Atari with Deep Reinforcement Learning".

Prosthesis: The Anti-Robot

From the Prosthesis project's Indiegogo campaign: Prosthesis: the world's first human-controlled racing robot. Formula 1, meet the future. Let the races begin. We are trying to save the future for the humans. With the relentless and unchecked automation of everything we do, we are trying to remind people that technology was invented to improve our quality of life, and that doesn't always mean doing everything for you. Sometimes that means doing something really, really challenging. Sometimes that means taking on something that many have dreamed of, but no one has dared try before. Like building and learning to pilot a two-story-tall, 3,500 kg walking machine that you use your whole body to control, without computers to help you... cont'd at the project homepage and on Indiegogo.

New Video Of JPL's RoboSimian

From the Jet Propulsion Laboratory's YouTube channel:

Robonaut 2 Gets Legs

NASA: NASA engineers are developing climbing legs for the International Space Station's robotic crew member Robonaut 2 (R2), marking another milestone in space humanoid robotics. The legless R2, currently attached to a support post, is undergoing experimental trials with astronauts aboard the orbiting laboratory. Since its arrival at the station in February 2011, R2 has performed a series of tasks to demonstrate its functionality in microgravity. These new legs, funded by NASA's Human Exploration and Operations and Space Technology mission directorates, will provide R2 the mobility it needs to help with regular and repetitive tasks inside and outside the space station. The goal is to free up the crew for more critical work, including scientific research.

Parrot Announces MiniDrone And Jumping Sumo At CES

From Parrot:

Rex: A Single-board Computer With A Full OS That Is Designed For Robots

From the Rex Kickstarter: Why do you want Rex? There are two general classes of electronics used in robot hardware: microcontrollers (e.g. Arduino) and single-board computers. Microcontrollers are great for projects that only require a single program to be run, quickly and without overhead, like controlling LEDs and motors. Single-board computers are great for anything you'd need a cheap, small computer for, like networking applications and image processing. Advanced autonomous robots require the strengths of both. A system developed around Rex, being made specifically for robots, brings it all together in one nice little package in a way that has never been done before.

Hardware specs:
- Texas Instruments DM3730 (1 GHz 32-bit ARM Cortex-A8 processor core, 800 MHz DSP core)
- 512 MB LPDDR RAM
- USB host port
- MicroSD slot
- Camera module port
- 3.5 mm audio-in jack
- 3.5 mm audio-out jack
- 5 V DC input for desktop development

Each Rex will come pre-installed with Alphalem OS, a FOSS custom Linux distribution. It includes a core set of built-in device drivers that we've hand-picked as being the most useful for robots (like USB WiFi adapters and cameras); we'll publish the list in a wiki on our website. Here are the other main features:
- An Arduino-style programming environment with support for multiple programming languages (C, C++, Python).
- A special task manager called the Master Control Program (MCP).
- An API for message passing in multi-process applications (a generic sketch of this pattern follows below).
- A standard Linux filesystem that will allow you to install just about any Linux software that can be cross-compiled for ARM.
- Libraries for common tasks such as I2C communication, face detection, and sensor reading.
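The message-passing API itself isn't documented in the excerpt above, so the following is only a generic Python sketch of the multi-process pattern it is meant to support (one process reads a sensor, another reacts to the readings); none of it is Alphalem OS or MCP code.

```python
# Generic multi-process message passing in Python. NOT the Rex/Alphalem OS
# API, which isn't documented here; this only illustrates the pattern of a
# sensor process feeding a control process through a queue.
import time
from multiprocessing import Process, Queue

def sensor_task(queue):
    """Pretend sensor driver: pushes a reading onto the queue 10x per second."""
    reading = 0.0
    while True:
        reading += 0.1                      # stand-in for a real I2C read
        queue.put({'range_m': reading, 'stamp': time.time()})
        time.sleep(0.1)

def control_task(queue):
    """Consumer: reacts to sensor messages as they arrive."""
    while True:
        msg = queue.get()                   # blocks until a message arrives
        if msg['range_m'] < 1.0:
            print('obstacle close, slowing down')
        else:
            print('path clear, full speed')

if __name__ == '__main__':
    q = Queue()
    Process(target=sensor_task, args=(q,), daemon=True).start()
    control_task(q)
```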

DARPA Robotics Challenge Trials Live Broadcast

The DRC Trials are happening today and tomorrow (December 20-21, 2013) at the Homestead-Miami Speedway. Teams will attempt to guide their robots through eight individual, physical tasks that test mobility, manipulation, dexterity, perception, and operator control mechanisms. You can watch the live stream here.

The Factory-in-a-Day Project

From Factory-in-a-Day's page: Small and medium-sized enterprises in Europe mostly refrain from using advanced robot technology. The EU project Factory-in-a-Day aims to change this by developing a robotic system that can be set up and made operational in 24 hours and is flexible, leasable and cheap. The project has a budget of 11 million euros over four years, 7.9 million of which will be funded by the European Union as part of the FP7 programme 'Factory of the Future'. The international consortium comprises 16 partners, and the coordinating university is Delft University of Technology (TU Delft). The project will start on 8 October 2013 with a formal kick-off meeting in Delft.

The Factory-in-a-Day project will provide a solution: a robot that can be set up and operational within 24 hours. SME companies can use the robot for a specific job, and their staff can learn how to work closely together with the robot and thus optimize their production. "With the technological and organizational innovations of the Factory-in-a-Day project, we hope to fundamentally change the ways in which robots are used in the manufacturing world," says project coordinator Martijn Wisse, Associate Professor at TU Delft.

How does it work? What will such an installation day look like? First of all, before the robot is actually taken to the SME premises, a system integrator analyzes which steps in the process can be taken over by the robot. In most cases the repetitive work is done by the robot, while the human worker carries out the more flexible, accurate tasks and deals with problem-solving. Customer-specific hardware components are 3D-printed and installed on the robot's grippers. The robot is then brought to the factory and set up, and any auxiliary components such as cameras are installed in the otherwise unaltered production facilities. The robot is connected to the machinery software through a brand-independent software system. After that, the robot is taught how to perform its set of tasks, for example how to grasp an object; to do this, the operator physically interacts with the robot. A set of predefined skills will be available, rather like apps for smartphones. Finally, the robot is operational and the human co-workers receive their training, all in just 24 hours.

Google Puts Money on Robots, Using the Man Behind Android

New York Times: Over the last half-year, Google has quietly acquired seven technology companies in an effort to create a new generation of robots. And the engineer heading the effort is Andy Rubin, the man who built Google's Android software into the world's dominant force in smartphones... (full article)

Dynamic Probabilistic Volumetric Models

From Ali Osman Ulusoy, Octavian Biris, and Joseph Mundy of Brown University: This paper presents a probabilistic volumetric framework for image-based modeling of general dynamic 3-d scenes. The framework is targeted towards high-quality modeling of complex scenes evolving over thousands of frames. Extensive storage and computational resources are required in processing large-scale space-time (4-d) data. Existing methods typically store separate 3-d models at each time step and do not address such limitations. A novel 4-d representation is proposed that adaptively subdivides in space and time to explain the appearance of 3-d dynamic surfaces. This representation is shown to achieve compression of 4-d data and provide efficient spatio-temporal processing. The advances of the proposed framework are demonstrated on standard datasets using free-viewpoint video and 3-d tracking applications... (full paper)
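A toy Python sketch of the central idea, subdividing a space-time cell only where its appearance samples vary too much, is given below; it is an illustration only (the sampling interface and variance heuristic are invented here) and not the authors' 4-d data structure.

```python
import numpy as np

def split_octants(bounds):
    """Split an axis-aligned box ((x0,x1),(y0,y1),(z0,z1)) into 8 octants."""
    (x0, x1), (y0, y1), (z0, z1) = bounds
    xm, ym, zm = 0.5 * (x0 + x1), 0.5 * (y0 + y1), 0.5 * (z0 + z1)
    return [((a, b), (c, d), (e, f))
            for (a, b) in ((x0, xm), (xm, x1))
            for (c, d) in ((y0, ym), (ym, y1))
            for (e, f) in ((z0, zm), (zm, z1))]

class SpaceTimeCell:
    """A box in space plus a time interval; splits adaptively in either."""
    def __init__(self, bounds, t0, t1):
        self.bounds = bounds
        self.t0, self.t1 = t0, t1
        self.children = []

    def refine(self, sample_fn, var_threshold=0.01, max_depth=6, depth=0):
        # sample_fn(bounds, t0, t1) -> (n_times, n_points) array of appearance
        # samples inside this cell (invented interface for this sketch).
        samples = sample_fn(self.bounds, self.t0, self.t1)
        if depth >= max_depth or np.var(samples) < var_threshold:
            return  # appearance is stable here: keep a single coarse cell
        spatial_var = np.var(samples.mean(axis=0))   # variation across space
        temporal_var = np.var(samples.mean(axis=1))  # variation across time
        if temporal_var > spatial_var:
            # The surface changes over time here: split the time interval.
            mid = 0.5 * (self.t0 + self.t1)
            self.children = [SpaceTimeCell(self.bounds, self.t0, mid),
                             SpaceTimeCell(self.bounds, mid, self.t1)]
        else:
            # Fine geometric/appearance detail here: split into octants.
            self.children = [SpaceTimeCell(b, self.t0, self.t1)
                             for b in split_octants(self.bounds)]
        for child in self.children:
            child.refine(sample_fn, var_threshold, max_depth, depth + 1)
```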

