机器人和机器人传感器中英文对照外文翻译文献

合集下载

传感器技术论文中英文对照资料外文翻译文献

传感器技术论文中英文对照资料外文翻译文献

传感器技术论文中英文对照资料外文翻译文献Development of New Sensor TechnologiesSensors are devices that can convert physical。

chemical。

logical quantities。

etc。

into electrical signals。

The output signals can take different forms。

such as voltage。

current。

frequency。

pulse。

etc。

and can meet the requirements of n n。

processing。

recording。

display。

and control。

They are indispensable components in automatic n systems and automatic control systems。

If computers are compared to brains。

then sensors are like the five senses。

Sensors can correctly sense the measured quantity and convert it into a corresponding output。

playing a decisive role in the quality of the system。

The higher the degree of n。

the higher the requirements for sensors。

In today's n age。

the n industry includes three parts: sensing technology。

n technology。

and computer technology。

机器人技术发展中英文对照外文翻译文献

机器人技术发展中英文对照外文翻译文献

机器人技术发展中英文对照外文翻译文献(文档含英文原文和中文翻译)外文资料:RobotsFirst, I explain the background robots, robot technology development. It should be said it is a common scientific and technological development of a comprehensive results, for the socio-economic development of a significant impact on a science and technology. It attributed the development of all countries in the Second World War to strengthen the economic input on strengthening the country's economic development. But they also demand the development of the productive forces the inevitable result of human development itself is the inevitable result then with the development of humanity, people constantly discuss the natural process, in understanding and reconstructing the natural process, people need to be able to liberate a slave. So this is the slave people to be able to replace the complex and engaged in heavy manual labor, People do not realize right up to the world's understanding and transformation of this technology as well as people in the development process of an objective need.Robots are three stages of development, in other words, we are accustomed to regarding robots are divided into three categories. is a first-generation robots, also known as teach-type robot, it is through a computer, to control over one of a mechanical degrees of freedom Through teaching and information stored procedures, working hours to read out information, and then issued a directive so the robot can repeat according to the people at that time said the results show this kind of movement again, For example, the car spot welding robots, only to put this spot welding process, after teaching, and it is always a repeat of a work It has the external environment is no perception that the force manipulation of the size of the work piece there does not exist, welding 0S It does not know, then this fact from the first generation robot, it will exist this shortcoming, it in the 20th century, the late 1970s, people started to study the second-generation robot, called Robot with the feeling that This feeling with the robot is similar in function of a certain feeling, for instance, force and touch, slipping, visual, hearing and who is analogous to that with all kinds of feelings, say in a robot grasping objects, In fact, it can be the size of feeling out, it can through visual, to be able to feel and identify its shape, size, color Grasping an egg, it adopted a acumen, aware of its power and the size of the slide.Third-generation robots, we were a robotics ideal pursued by the most advanced stage, called intelligent robots, So long as tell it what to do, not how to tell it to do, it will be able to complete the campaign, thinking and perception of this man-machine communication function and function Well, this current development or relative is in a smart part of the concept and meaning But the real significance of the integrity of this intelligent robot did not actually exist, but as we continued the development of science and technology, the concept of intelligent increasingly rich, it grows ever wider connotations.Now I have a brief account of China's robot development of the basic profiles. As our country there are many other factors that problem. Our country in robotics research of the 20th century the late 1970s. At that time, we organized at the national, a Japanese industrial automation products exhibition. In this meeting, there are two products, is a CNC machine tools, an industrial robot, this time, our country's many scholars see such a direction, has begun to make a robot research But this time, are basically confined to the theory of phase .Then the real robot research, in 7500 August 5, 1995, 15 nearly 20 years of development, The most rapid development, in 1986 we established a national plan of 863 high-technology development plan, As robot technology will be an important theme of the development of The state has invested nearly Jiganyi funds begun to make a robot, We made the robot in the field quickly and rapid development.At present, units like the CAS ShenYng Institute of Automation, the original machinery, automation of the Ministry, as of Harbin Industrial University, Beijing University of Aeronautics and Astronautics, Qinghua University, Chinese Academy of Sciences, also includes automation of some units, and so on have done a very important study, also made a lot of achievements Meanwhile, in recent years, we end up in college, a lot of flats in robot research, Many graduate students and doctoral candidates are engaged in robotics research, we are more representative national study Industrial robots, underwater robots, space robots, robots in the nuclear industry are on the international level should be taking the lead .On the whole of our country Compared with developed countries, there is still a big gap, primarily manifested in the We in the robot industry, at present there is no fixed maturity product, but in theseunderwater, space, the nuclear industry, a number of special robots, we have made a lot of achievements characteristics.Now, I would like to briefly outline some of the industrial robot situation. So far, the industrial robot is the most mature and widely used category of a robot, now the world's total sales of 1.1 million Taiwan, which is the 1999 statistics, however, 1.1 million in Taiwan have been using the equipment is 75 million, this volume is not small. Overall, the Japanese industrial robots in this one, is the first of the robots to become the Kingdom, the United States have developed rapidly. Newly installed in several areas of Taiwan, which already exceeds Japan, China has only just begun to enter the stage of industrialization, has developed a variety of industrial robot prototype and small batch has been used in production.Spot welding robot is the auto production line, improve production efficiency and raise the quality of welding car, reduce the labor intensity of a robot. It is characterized by two pairs of robots for spot welding of steel plate, bearing a great need for the welding tongs, general in dozens of kilograms or more, then its speed in meters per second a 5-2 meter of such high-speed movement. So it is generally five to six degrees of freedom, load 30 to 120 kilograms, the great space, probably expected that the work of a spherical space, a high velocity, the concept of freedom, that is to say, Movement is relatively independent of the number of components, the equivalent of our body, waist is a rotary degree of freedom We have to be able to hold his arm, Arm can be bent, then this three degrees of freedom, Meanwhile there is a wrist posture adjustment to the use of the three autonomy, the general robot has six degrees of freedom. We will be able to space the three locations, three postures, the robot fully achieved, and of course we have less than six degrees of freedom. Have more than six degrees of freedom robot, in different occasions the need to configure.The second category of service robots, with the development of industrialization, especially in the past decade, Robot development in the areas of application are continuously expanding, and now a very important characteristic, as we all know, Robot has gradually shifted from manufacturing to non-manufacturing and service industries, we are talking about the car manufacturer belonging to the manufacturing industry, However, the services sector including cleaning, refueling, rescue, rescue,relief, etc. These belong to the non-manufacturing industries and service industries, so here is compared with the industrial robot, it is a very important difference. It is primarily a mobile platform, it can move to sports, there are some arms operate, also installed some as a force sensor and visual sensors, ultrasonic ranging sensors, etc. It’s surrounding environment for the conduct of identification, to determine its campaign t o complete some work, this is service robot’s one of the basic characteristics.For example, domestic robot is mainly embodied in the example of some of the carpets and flooring it to the regular cleaning and vacuuming. The robot it is very meaningful, it has sensors, it can furniture and people can identify, It automatically according to a law put to the ground under the road all cleaned up. This is also the home of some robot performance.The medical robots, nearly five years of relatively rapid development of new application areas. If people in the course of an operation, doctors surgery, is a fatigue, and the other manually operated accuracy is limited. Some universities in Germany, which, facing the spine, lumbar disc disease, the identification, can automatically use the robot-aided positioning, operation and surgery Like the United States have been more than 1,000 cases of human eyeball robot surgery, the robot, also including remote-controlled approach, the right of such gastrointestinal surgery, we see on the television inside. a manipulator, about the thickness fingers such a manipulator, inserted through the abdominal viscera, people on the screen operating the machines hand, it also used the method of laser lesion laser treatment, this is the case, people would not have a very big damage to the human body.In reality, this right as a human liberation is a very good robots, medical robots it is very complex, while it is fully automated to complete all the work, there are difficulties, and generally are people to participate. This is America, the development of such a surgery Lin Bai an example, through the screen, through a remote control operator to control another manipulator, through the realization of the right abdominal surgery A few years ago our country the exhibition, the United States has been successful in achieving the right to the heart valve surgery and bypass surgery. This robot has in the area, caused a great sensation, but also, AESOP's surgical robot, In fact, it through some equipment to some of the lesions inspections, through amanipulator can be achieved on some parts of the operation Also including remotely operated manipulator, and many doctors are able to participate in the robot under surgery Robot doctor to include doctors with pliers, tweezers or a knife to replace the nurses, while lighting automatically to the doctor's movements linked, the doctor hands off, lighting went off, This is very good, a doctor's assistant.We regard this country excel, it should be said that the United States, Russia and France, in our nation, also to the international forefront, which is the CAS ShenYang Institute of Automation of developing successful, 6,000 meters underwater without cable autonomous underwater robot, the robot to 6,000 meters underwater, can be conducted without cable operations. His is 2000, has been obtained in our country one of the top ten scientific and technological achievements. This indicates that our country in this underwater robot, have reached the advanced international level, 863 in the current plan, the development of 7,000 meters underwater in a manned submersible to the ocean further development and operation, This is a great vote of financial and material resources.In this space robotics research has also been a lot of development. In Europe, including 16 in the United States space program, and the future of this space capsule such a scheme, One thing is for space robots, its main significance lies in the development of the universe and the benefit of mankind and the creation of new human homes, Its main function is to scientific investigation, as production and space scientific experiments, satellites and space vehicles maintenance and repair, and the construction of the space assembly. These applications, indeed necessary, for example, scientific investigation, as if to mock the ground some physical and chemical experiments do not necessarily have people sitting in the edge of space, because the space crew survival in the day the cost is nearly one million dollars. But also very dangerous, in fact, some action is very simple, through the ground, via satellite control robot, and some regularly scheduled completion of the action is actually very simple. Include the capsule as control experiments, some switches, buttons, simple flange repair maintenance, Robot can be used to be performed by robots because of a solar battery, then the robot will be able to survive, we will be able to work, We have just passed the last robot development on the application of the different areas ofapplication, and have seen the robots in industry, medical, underwater, space, mining, construction, service, entertainment and military aspects of the application .Also really see that the application is driven by the development of key technologies, a lack of demand, the robot can not, It is because people in understanding the natural transformation of the natural process, the needs of a wide range of robots, So this will promote the development of key technologies, the robot itself for the development of From another aspect, as key technology solutions, as well as the needs of the application, on the promotion of the robot itself a theme for the development of intelligent, and from teaching reappearance development of the current local perception of the second-generation robot, the ultimate goal, continuously with other disciplines and the development of advanced technology, the robot has become rich, eventually achieve such an intelligent robot mainstream.Robot is mankind's right-hand man; friendly coexistence can be a reliable friend. In future, we will see and there will be a robot space inside, as a mutual aide and friend. Robots will create the jobs issue. We believe that there would not be a "robot appointment of workers being laid off" situation, because people with the development of society, In fact the people from the heavy physical and dangerous environment liberated, so that people have a better position to work, to create a better spiritual wealth and cultural wealth.译文:机器人首先我介绍一下机器人产生的背景,机器人技术的发展,它应该说是一个科学技术发展共同的一个综合性的结果,同时,为社会经济发展产生了一个重大影响的一门科学技术,它的发展归功于在第二次世界大战中各国加强了经济的投入,就加强了本国的经济的发展。

机器人类外文文献翻译穿越深渊的机器人中英文翻译、外文翻译

机器人类外文文献翻译穿越深渊的机器人中英文翻译、外文翻译

英文原文The Abyss Transit System- James Cameron commissions the making of robots for a return to theTitanicBy Gary StixAt the beginning of the movie that made Leonardo DiCaprio a megastar, a camera-toting unmanned robot ventured into a cavernous hole in the wreck that sits on the bottom of the Atlantic, 12,640 feet from the surface. The 500-pound vehicle, christened Snoop Dog, could move only about 30 feet along a lower deck, hampered by its bulky two-inch-diameter tether hitched to a submarine that waited above. The amount of thrust needed to move its chunky frame stirred up a thick cloud. “The vehicle very quickly silted out the entire place and made imaging impossible,” director James Cameron recalls.But the eerie vista revealed by Snoop Dog on that 1995 expedition made Cameron hunger for more. He vowed to return one day with technology that could negotiate anyplace within the Titanic's interior.In the past six months two documentaries—one for IMAX movie theaters called Ghosts of the Abyss, the other, Expedition: Bismarck, for the DiscoveryChannel—demonstrated the fruits of a three-year effort that Cameron financed with $1.8 million of his own money to make this vision materialize. The payoff was two 70-pound robots, named after Blues Brothers Jake and Elwood, that had the full run of two of the world's most famous wrecks, the Titanic and the Bismarck, which they visited on separate expeditions.The person who took Jake and Elwood from dream to robot is Mike Cameron, James's brother, an aerospace engineer who once designed missiles and who also possesses a diverse background as a helicopter pilot, stunt photographer and stuntman. (Remember the corpse in the movie The Abyss, from whose mouth a crab emerges?) Giving the remotely operated vehicles freedom of movement required that they be much smaller than Snoop Dog and that the tether's width be tapered dramatically so as not to catch on vertical ship beams.Mike Cameron took inspiration from the wire-guided torpedoes used by the military that can travel for many miles. His team created vehicles operable to more than 20,000 feet (enough to reach as much as 85 percent of the ocean floor). The dimensions of the front of the robot are 16 inches high by 17 inches across, small enough to fit in a B deck window of the Titanic. The bots have an internal battery so that they do not need to be powered through a tether. Instead the tether—fifty-thousandths of an inch in diameter—contains optical fibers that relaycontrol signals from a manned submersible vehicle hovering outside and that also send video images in the other direction. The tether pays out from the robot, a design that prevents it from snagging on objects in the wreck.James Cameron thought the project would be a straightforward engineering task, not much harder than designing a new camera system. “This turned out to be a whole different order of magnitude,” he says. “There was no commercial off-the-shelf hardware that wo uld work in the vehicles. Everything had to be built from scratch.” If the team had known this early on, he added, “we wouldn't have bothered.” Water pressure on the cable that carried the optical fibers could create microscopic bends in the data pipe, completely cutting off the control signals from the submersibles. Dark Matter in Valencia, Calif. (Mike Cameron's company), had to devise a fluid-filled sheath around the fiber to displace the minuscule air pockets in the cable that could lead to the microbending.To save weight, the frame—similar to a monocoque body of a race car—was made up of small glass hollow spheres contained in an epoxy matrix. The thruster contained a large-diameter, slowly rotating blade with nozzles that diffused the propulsive flow, minimizing the churning that would otherwise disturb the caked silt.A high-resolution video camera, along with an infrared camera for navigation, was placed in the front of the craft along with three light-emitting-diode arrays for fill lighting and two quartz halogen lamps for spotlighting.The winter of 2001 marked a critical juncture. It was six months before dives to the Titanic could be safely attempted, and James had to determine whether to proceed or wait another year. “Mike was really, really negative on the idea, but I decided to go for it,” the director says. He felt he couldn't afford to wait longer and thought that a fixed deadline would focus the engineering staff at Dark Matter. Forhis part, Mike was contending with an unending series of design challenges. “It was such an overwhelming set of problems that I had very little confidence that certain parts would be solvable in the time we had,” Mike says.A few weeks before the dives commenced in the summer of 2001, the robots' lithium sulfur dioxode-based batteries caught fire while being tested in a pressure tank, destroying what was to have been a third robot. Mike wanted to delay the dives, but James found a supplier of another type of lithium battery and pressed ahead.At the dive site, Jake and Elwood took starring roles with their 2,000-foot tethers, exploring for the first time in about 90 years remote parts of the ships, including the engine room, the firemen's mess hall and the cabins of first-class passengers—even focusing in on a bowler hat, a brass headboard and an intact, upright glass decanter. The images lack the resolution and novel quality of the high-definition, three-dimensional IMAX images, the other major technological innovation of Ghostsof the Abyss. Jake and Elwood's discoveries, however, draw the viewers' interest because of what they convey of the Titanic's mystique. “You actually feel like you're out there in the wreck,” Mike says. He remembers his brother piloting the robots with the helicopter stick that had been installed in the Russian submersible from which the robots were launched. “Jim ended up being a cowboy pilot,” Mike says. “He was far more aggressive with the system than I was.”One scene in Ghosts of the Abyss reveals the tension that sometimes erupted between the brothers. James contemplates moving one of the robots through a cabin window that is still partially occluded by a shard of glass that could damage the vehicle or cut the data tether. When James declares that he is going to take Jake in, moviegoers can hear Mike pleading with his brother not to do it, ultimately relenting once the bot has negotiated the opening.The decision to install a new type of battery at the last minute came to haunt the expedition; Elwood's lithium-polymer battery ignited while in the bowels of the ship. James manipulated the remaining robot into the Titanic to perform a rescue operation by hooking a cord to the grill of the dead bot and towing it out. At the surface—on the deck of the Russian scientific vessel the Keldysh, from which the two submarines carrying Jake and Elwood to the Titanic were launched—Mike rebuilt Elwood with a backup battery. During the next dive, the robot caught fire again while it was still mounted on the submarine, endangering the crew. Finally, Mike worked for an 18-hour stretch to adapt a lead-acid gel battery used for devices onboard the mother ship into a power source for Elwood, enabling the expedition to continue.The bots, now fitted with a new, nonflammable battery that Mike designed, may find service beyond motion pictures. The U.S. Navy has funded Dark Matter to help it assess the technology for underwater recovery operations of ships or aircraft. The bots also have potential for scientific exploration of deep-sea trenches. After traveling to the Titanic and the Bismarck, the team went on to probe mid-Atlantic hydrothermal vents, discovering mollusks in a place where scientists had never encountered them before. As adventure aficionados, the brothers speculate that a descendant of Jake and Elwood might even be toted on a mission to Europa, one of Jupiter's moons, to investigate the waters that are suspected to exist below its icy shell. The Cameron siblings, who tinkered with home-built rafts and rockets as children in Ontario near Niagara Falls, hope to be around long enough to witness their robotic twins go from the bottom of the ocean to the depths of space.中文译文穿越深渊的机器--新型的机器人可在数百公尺深的水底残骸间自由穿梭游览作者╱斯蒂克斯( Gary Stix )曾一举捧红超级巨星李奥纳多狄卡皮欧的电影「铁达尼号」中,片头是一台无人驾驶的遥控装置,携带着摄影机深入大西洋,在3852公尺深的铁达尼号残骸里冒险的画面。

机器人外文文献翻译、中英文翻译

机器人外文文献翻译、中英文翻译

外文资料robotThe industrial robot is a tool that is used in the manufacturing environment to increase productivity. It can be used to do routine and tedious assembly line jobs,or it can perform jobs that might be hazardous to the human worker . For example ,one of the first industrial robot was used to replace the nuclear fuel rods in nuclear power plants. A human doing this job might be exposed to harmful amounts of radiation. The industrial robot can also operate on the assembly line,putting together small components,such as placing electronic components on a printed circuit board. Thus,the human worker can be relieved of the routine operation of this tedious task. Robots can also be programmed to defuse bombs,to serve the handicapped,and to perform functions in numerous applications in our society.The robot can be thought of as a machine that will move an end-of-tool ,sensor ,and/or gripper to a preprogrammed location. When the robot arrives at this location,it will perform some sort of task .This task could be welding,sealing,machine loading ,machine unloading,or a host of assembly jobs. Generally,this work can be accomplished without the involvement of a human being,except for programming and for turning the system on and off.The basic terminology of robotic systems is introduced in the following:1. A robot is a reprogrammable ,multifunctional manipulator designed to move parts,material,tool,or special devices through variable programmed motions for the performance of a variety of different task. This basic definition leads to other definitions,presented in the following paragraphs,that give acomplete picture of a robotic system.2. Preprogrammed locations are paths that the robot must follow to accomplish work,At some of these locations,the robot will stop and perform some operation ,such as assembly of parts,spray painting ,or welding .These preprogrammed locations are stored in the robot’s memory and are recalled later for continuousoperation.Furthermore,these preprogrammed locations,as well as other program data,can be changed later as the work requirements change.Thus,with regard to this programming feature,an industrial robot is very much like a computer ,where data can be stoned and later recalled and edited.3. The manipulator is the arm of the robot .It allows the robot to bend,reach,and twist.This movement is provided by the manipulator’s axes,also called the degrees of freedom of the robot .A robot can have from 3 to 16 axes.The term degrees of freedom will always relate to the number of axes found on a robot.4. The tooling and frippers are not part the robotic system itself;rather,they are attachments that fit on the end of the robot’s arm. These attachments connected to the end of the robot’s arm allow the robot to lift parts,spot-weld ,paint,arc-weld,drill,deburr,and do a variety of tasks,depending on what is required of the robot.5. The robotic system can control the work cell of the operating robot.The work cell of the robot is the total environment in which the robot must perform itstask.Included within this cell may be the controller ,the robot manipulator ,a work table ,safety features,or a conveyor.All the equipment that is required in order for the robot to do its job is included in the work cell .In addition,signals from outside devices can communicate with the robot to tell the robot when it should parts,pick up parts,or unload parts to a conveyor.The robotic system has three basic components: the manipulator,the controller,and the power source.A.ManipulatorThe manipulator ,which does the physical work of the robotic system,consists of two sections:the mechanical section and the attached appendage.The manipulator also has a base to which the appendages are attached.Fig.1 illustrates the connectionof the base and the appendage of a robot.图1.Basic components of a robot’s manipulatorThe base of the manipulator is usually fixed to the floor of the work area. Sometimes,though,the base may be movable. In this case,the base is attached to either a rail or a track,allowing the manipulator to be moved from one location to anther.As mentioned previously ,the appendage extends from the base of the robot. The appendage is the arm of the robot. It can be either a straight ,movable arm or a jointed arm. The jointed arm is also known as an articulated arm.The appendages of the robot manipulator give the manipulator its various axes of motion. These axes are attached to a fixed base ,which,in turn,is secured to a mounting. This mounting ensures that the manipulator will in one location.At the end of the arm ,a wrist(see Fig 2)is connected. The wrist is made up of additional axes and a wrist flange. The wrist flange allows the robot user to connect different tooling to the wrist for different jobs.图2.Elements of a work cell from the topThe manipulator’s axes allow it to perform work within a certain area. The area is called the work cell of the robot ,and its size corresponds to the size of the manipulator.(Fid2)illustrates the work cell of a typical assembly ro bot.As the robot’s physical size increases,the size of the work cell must also increase.The movement of the manipulator is controlled by actuator,or drive systems.The actuator,or drive systems,allows the various axes to move within the work cell. The drive system can use electric,hydraulic,or pneumatic power.The energy developed by the drive system is converted to mechanical power by various mechanical power systems.The drive systems are coupled through mechanical linkages.These linkages,in turn,drive the different axes of the robot.The mechanical linkages may be composed of chain,gear,and ball screws.B.ControllerThe controller in the robotic system is the heart of the operation .The controller stores preprogrammed information for later recall,controls peripheral devices,and communicates with computers within the plant for constant updates in production.The controller is used to control the robot manipulator’s movements as well as to control peripheral components within the work cell. The user can program the movements of the manipulator into the controller through the use of a hard-held teach pendant.This information is stored in the memory of the controller for later recall.The controller stores all program data for the robotic system.It can store several differentprograms,and any of these programs can be edited.The controller is also required to communicate with peripheral equipment within the work cell. For example,the controller has an input line that identifies when a machining operation is completed.When the machine cycle is completed,the input line turn on telling the controller to position the manipulator so that it can pick up the finished part.Then ,a new part is picked up by the manipulator and placed into the machine.Next,the controller signals the machine to start operation.The controller can be made from mechanically operated drums that step through a sequence of events.This type of controller operates with a very simple robotic system.The controllers found on the majority of robotic systems are more complex devices and represent state-of-the-art eletronoics.That is,they are microprocessor-operated.these microprocessors are either 8-bit,16-bit,or 32-bit processors.this power allows the controller to be very flexible in its operation.The controller can send electric signals over communication lines that allow it to talk with the various axes of the manipulator. This two-way communication between the robot manipulator and the controller maintains a constant update of the end the operation of the system.The controller also controls any tooling placed on the end of the robot’s wrist.The controller also has the job of communicating with the different plant computers. The communication link establishes the robot as part a computer-assisted manufacturing (CAM)system.As the basic definition stated,the robot is a reprogrammable,multifunctional manipulator.Therefore,the controller must contain some of memory stage. The microprocessor-based systems operates in conjunction with solid-state devices.These memory devices may be magnetic bubbles,random-access memory,floppy disks,or magnetic tape.Each memory storage device stores program information fir or for editing.C.power supplyThe power supply is the unit that supplies power to the controller and the manipulator. The type of power are delivered to the robotic system. One type of power is the AC power for operation of the controller. The other type of power isused for driving the various axes of the manipulator. For example,if the robot manipulator is controlled by hydraulic or pneumatic drives,control signals are sent to these devices causing motion of the robot.For each robotic system,power is required to operate the manipulator .This power can be developed from either a hydraulic power source,a pneumatic power source,or an electric power source.There power sources are part of the total components of the robotic work cell.中文翻译机器人工业机器人是在生产环境中用以提高生产效率的工具,它能做常规乏味的装配线工作,或能做那些对于工人来说是危险的工作,例如,第一代工业机器人是用来在核电站中更换核燃料棒,如果人去做这项工作,将会遭受有害放射线的辐射。

一个有关移动机器人定位的视觉传感器模型外文文献翻译、中英文翻译

一个有关移动机器人定位的视觉传感器模型外文文献翻译、中英文翻译

XX设计(XX)外文资料翻译A Visual-Sensor Model for Mobile Robot Localisation Matthias Fichtner Axel Gro_mannArti_cial Intelligence InstituteDepartment of Computer ScienceTechnische Universit•at DresdenTechnical Report WV-03-03/CL-2003-02AbstractWe present a probabilistic sensor model for camera-pose estimation in hallways and cluttered o_ce environments. The model is based on the comparison of features obtained from a given 3D geometrical model of the environment with features present in the camera image. The techniques involved are simpler than state-of-the-art photogrammetric approaches. This allows the model to be used in probabilistic robot localisation methods. Moreover, it is very well suited for sensor fusion. The sensor model has been used with Monte-Carlo localisation to track the position of a mobile robot in a hallway navigation task. Empirical results are presented for this application.1 IntroductionThe problem of accurate localisation is fundamental to mobile robotics. To solve complex tasks successfully, an autonomous mobile robot has to estimate its current pose correctly and reliably. The choice of the localization method generally depends on the kind and number of sensors, the prior knowledge about the operating environment, and the computing resources available. Recently, vision-based navigation techniques have become increasingly popular [3]. Among the techniques for indoor robots, we can distinguish methods that were developed in the _eld of photogrammetry and computer vision, and methods that have their origin in AI robotics.An important technical contribution to the development of vision-based navigation techniques was the work by [10] on the recognition of 3D-objects from unknown viewpoints in single images using scale-invariant features. Later, this technique was extended to global localisation and simultaneous map building [11].The FINALE system [8] performed position tracking by using a geometrical model of the environment and a statistical model of uncertainty in the robot's pose given the commanded motion. The robot's position is represented by a Gaussian distribution and updated by Kalman _ltering. The search for corresponding features in camera image and world model is optimized by projecting the pose uncertainty into the camera image.Monte Carlo localisation (MCL) based on the condensation algorithm has been applied successfully to tour-guide robots [1]. This vision-based Bayesian _ltering technique uses a sampling-based density representation. In contrast to FINALE, it canrepresent multi-modal probability distributions. Given a visual map of the ceiling, it localises the robot globally using a scalar brightness measure. [4] presented avision-based MCL approach that combines visual distance features and visual landmarks in a RoboCup application. As their approach depends on arti_cial landmarks, it is not applicable in o_ce environments.The aim of our work is to develop a probabilistic sensor model for camerapose estimation. Given a 3D geometrical map of the environment, we want to find an approximate measure of the probability that the current camera image has been obtained at a certain place in the robot's operating environment. We use this sensor model with MCL to track the position of a mobile robot navigating in a hallway. Possibly, it can be used also for localization in cluttered o_ce environments and for shape-based object detection.On the one hand, we combine photogrammetric techniques for map-based feature projection with the exibility and robustness of MCL, such as the capability to deal with localisation ambiguities. On the other hand, the feature matching operation should be su_ciently fast to allow sensor fusion. In addition to the visual input, we want to use the distance readings obtained from sonars and laser to improve localisation accuracy.The paper is organised as follows. In Section 2, we discuss previous work. In Section 3, we describe the components of the visual sensor model. In Section 4, we present experimental results for position tracking using MCL. We conclude in Section 5.2 Related WorkIn classical approaches to model-based pose determination, we can distinguish two interrelated problems. The correspondence problem is concerned with _nding pairs of corresponding model and image features. Before this mapping takes place, the model features are generated from the world model using a given camera pose. Features are said to match if they are located close to each other. Whereas the pose problem consists of _nding the 3D camera coordinates with respect to the origin of the world model given the pairs of corresponding features [2]. Apparently, the one problem requires the other to be solved beforehand, which renders any solution to the coupled problem very di_cult [6].The classical solution to the problem above follows a hypothesise-and-test approach: (1)Given a camera pose estimate, groups of best matching feature pairs provideinitial guesses (hypotheses).(2)For each hypothesis, an estimate of the relative camera pose is computed byminimising a given error function de_ned over the associated feature pairs. (3)Now as there is a more accurate pose estimate available for each hypothesis, theremaining model features are projected onto the image using the associatedcamera pose. The quality of the match is evaluated using a suitable error function, yielding a ranking among all hypotheses.(4)The highest-ranking hypothesis is selected.Note that the correspondence problem is addressed by steps (1) and (3), and the poseproblem by (2) and (4).The performance of the algorithm will depend on the type of features used, e.g., edges, line segments, or colour, and the choice of the similarity measure between image and model, here referred to as error function. Line segments is the feature type of our choice as they can be detected comparatively reliably under changing illumination conditions. As world model, we use a wire-frame model of the operating environment, represented in VRML. The design of a suitable similarity measure is far more difficult.In principle, the error function is based on the di_erences in orientation between corresponding line segments in image and model, their distance and difference in length, in order of decreasing importance, in consideration of all feature pairs present. This has been established in the following three common measures [10]. e3D is defined by the sum of distances between model line endpoints and the corresponding plane given by camera origin and image line. This measure strongly depends on the distance to the camera due to back-projection. e2D;1, referred to as in_nite image lines, is the sum over the perpendicular distances of projected model line endpoints to corresponding, in_nitely extended lines in the image plane. The dual measure, e2D;2, referred to as in_nite model lines, is the sum over all distances of image line endpoints to corresponding, in_nitely extended model lines in the image plane.To restrict the search space in the matching step, [10] proposed to constrain the number of possible correspondences for a given pose estimate by combining line features into perceptual, quasi-invariant structures beforehand. Since these initial correspondences are evaluated by e2D;1 and e2D;2, high demands are imposed on the accuracy of the initial pose estimate and the image processing operations, includingthe removal of distortions and noise and the feature extraction. It is assumed to obtain all visible model lines at full length. [12, 9] demonstrated that a few outliers already can severely affect the initial correspondences in Lowe's original approach due to frequent truncation of lines caused by bad contrast, occlusion, or clutter.3 Sensor ModelOur approach was motivated by the question whether a solution to the correspondence problem can be avoided in the estimation of the camera pose. Instead, we propose to perform a relatively simple, direct matching of image and model features. We want to investigate the level of accuracy and robustness one can achieve this way.The processing steps involved in our approach are depicted in Figure 1. After removing the distortion from the camera image, we use the Canny operator to extract edges. This operator is relatively tolerant to changing illumination conditions. From the edges, line segments are identi_ed. Each line is represented as a single point (_; _) in the 2D Hough space given by _ = x cos _ + y sin _. The coordinates of the end points are neglected. In this representation, truncated or split lines will have similar coordinates in the Hough space. Likewise, the lines in the 3D map are projected onto the image plane using an estimate of the camera pose and taking into account the visibility constraints, and are represented as coordinates in the Hough space as well. We have designed several error functions to be used as similarity measure in the matching step. They are described in the following.Centred match count (CMC)The first similarity measure is based on the distance of line segments in the Hough space. We consider only those image features as possible matches that lie within a rectangular cell in the Hough space centred around the model feature. The matches are counted and the resulting sum is normalised. The mapping from the expectation (model features) to the measurement (image features) accounts for the fact that the measure should be invariant with respect to objects not modelled in the 3D map or unexpected changes in the operating environment. Invariance of the number of visible features is obtained by normalisation. Speci_cally, the centred match count measure sCMC is defined by:where the predicate p de_nes a valid match using the distance parameters (t_; t_) and the operator # counts the number of matches. Generally speaking, this similarity measure computes the proportion of expected model Hough points hei 2 He that are con_rmed by at least one measured image Hough point hmj 2 Hm falling within tolerance (t_; t_). Note that neither endpoint coordinates nor lengths are considered here.Grid length match (GLM)The second similarity measure is based on a comparison of the total length values of groupes of lines. Split lines in the image are grouped together using a uniform discretisation of the Hough space. This method is similar to the Hough transform for straight lines. The same is performed for line segments obtained from the 3D model. Let lmi;j be the sum of lengths of measured lines in the image falling into grid cell (i; j), likewise lei;j for expected lines according to the model, then the grid length match measure sGLM is de_ned as:For all grid cells containing model features, this measure computes the ratio of the total line length of measured and expected lines. Again, the mapping is directional, i.e., the model is used as reference, to obtain invariance of noise, clutter, and dynamic objects.Nearest neighbour and Hausdorf distanceIn addition, we experimented with two generic methods for the comparison of two sets of geometric entities: the nearest neighbour and the Hausdor_ distance. For details see [7]. Both rely on the de_nition of a distance function, which we based on the coordinates in the Hough space, i.e., the line parameter _ and _, and optionally the length, in a linear and exponential manner. See [5] for a complete description. Common error functionsFor comparisons, we also implemented the commonly used error functions e3D,e2D;1, and e2D;2. As they are de_ned in the Cartesian space, we represent lines in the Hessian notation, x sin _ y cos _ = d. Using the generic error function f, we de_ned the similarity measure as:where M is the set of measured lines and E is the set of expected lines. In case ofe2D;1, f is de_ned by the perpendicular distances between both model line endpoints, e1, e2, and the in_nitely extended image line m:Likewise, the dual similarity measure, using e2D;2, is based on the perpendicular distances between the image line endpoints and the in_nitely extended model line. Recalling that the error function e3D is proportional to the distances of model line endpoints to the view plane through an image line and the camera origin, we can instantiate Equation 1 using f3D(m; e) de_ned as:where ~nm denotes the normal vector of the view plane given by the image endpoints ~mi = [mx;my;w]T in camera coordinates.Obtaining probabilitiesIdeally, we want the similarity measure to return monotonically decreasing values as the pose estimate used for projecting the model features departs from the actual camera pose. As we aim at a generally valid yet simple visual-sensor model, the idea is to abstract from speci_c poses and environmental conditions by averaging over a large number of di_erent, independent situations. For commensurability, we want to express the model in terms of relative robot coordinates instead of absolute world coordinates. In other words, we assumeto hold, i.e., the probability for the measurement m, given the pose lm this image has been taken at, the pose estimate le, and the world model w, is equal to the probability of this measurement given a three-dimensional pose deviation 4l and the world model w.The probability returned by the visual-sensor model is obtained by simple scaling:4 Experimental ResultsWe have evaluated the proposed sensor model and similarity measures in a series of experiments. Starting with arti_cially created images using idealized conditions, we have then added distortions and noise to the tested images. Subsequently, we have used real images from the robot's camera obtained in a hallway. Finally, we have usedthe sensor model to track the position of the robot while it was travelling through the hallway. In all these cases, a three-dimensional visualisation of the model was obtained, which was then used to assess the solutions.Simulations using arti_cially created imagesAs a first kind of evaluation, we generated synthetic image features by generating a view at the model from a certain camera pose. Generally speaking, we duplicated the right-hand branch of Figure 1 onto the left-hand side. By introducing a pose deviation 4l, we can directly demonstrate its inuence on the similarity values. For visualisation purposes, the translational deviations 4x and 4y are combined into a single spatial deviation 4t. Initial experiments have shown only insigni_cant di_erences when they were considered independently.Fig. 2: Performance of CMC on arti_cially created images.For each similarity measure given above, at least 15 million random camera poses were coupled with a random pose deviation within the range of 4t < 440cm and 4_ < 90_ yielding a model pose.The results obtained for the CMC measure are depicted in Figure 2. The surface of the 3D plot was obtained using GNUPLOT's smoothing operator dgrid3d. We notice a unique, distinctive peak at zero deviation with monotonically decreasing similarity values as the error increases. Please note that this simple measure considers neither endpoint coordinates nor lengths of lines. Nevertheless, we obtain already a decent result.While the resulting curve for the GLM measure resembles that of CMC, the peak is considerably more distinctive. This conforms to our anticipation since taking the length of image and model lines into account is very signi_cant here. In contrast to the CMC measure, incidental false matches are penalised in this method, due to the differing lengths.The nearest neighbour measure turned out to be not of use. Although linear and exponential weighting schemes were tried, even taking the length of line segmentsinto account, no distinctive peak was obtained, which caused its exclusion from further considerations.The measure based on the Hausdor_ distance performed not as good as the first two, CMC and GLM, though it behaved in the desired manner. But its moderate performance does not pay off the longest computation time consumed among all presented measures and is subsequently disregarded.So far, we have shown how our own similarity measures perform. Next, we demonstrate how the commonly used error functions behave in this framework.The function e2D;1 performed very well in our setting. The resulting curve closely resembles that of the GLM measure. Both methods exhibit a unique, distinctive peak at the correct location of zero pose deviation. Note that the length of line segments has a direct e_ect on the similarity value returned by measure GLM, while this attribute implicitly contributes to the measure e2D;1, though both linearly. Surprisingly, the other two error functions e2D;2 and e3D performed poorly.Toward more realistic conditionsIn order to learn the e_ect of distorted and noisy image data on our sensor model, we conducted another set of experiments described here. To this end, we applied the following error model to all synthetically generated image features before they are matched against model features. Each original line is duplicated with a small probability (p = 0:2) and shifted in space. Any line longer than 30 pixel is split with probability p=0:3. A small distortion is applied to the parameters (_; _; l) of each line according to a random, zeromean Gaussian. Furthermore, features not present in the model and noise are simulated by adding random lines uniformly distributed in the image. Hereof, the orientation is drawn according to the current distribution of angles to yield fairly `typical' features.The results obtained in these simulations do not di_er significantly from the first set of experiments. While the maximum similarity value at zero deviation decreased, the shape and characteristics of all similarity measures still under consideration remained the same.Using real images from the hallwaySince the results obtained in the simulations above might be questionable with respect to real-world conditions, we conducted another set of experiments replacing the synthetic feature measurements by real camera images.To compare the results for various parameter settings, we gathered images with a Pioneer 2 robot in the hallway o_-line and recorded the line features. For two di_erent locations in the hallway exemplifying typical views, the three-dimensional space of the robot poses (x; y; _) was virtually discretized. After placing the robot manually at each vertex (x; y; 0), it performed a full turn on the spot stepwise recording images. This ensures a maximum accuracy of pose coordinates associated with each image. That way, more than 3200 images have been collected from 64 di_erent (x; y)locations. Similarly to the simulations above, pairs of poses (le; lm) were systematically chosenFig. 3: Performance of GLM on real images from the hallway.from with the range covered by the measurements. The values computed by the sensor model referring to the same discretized value of pose deviation 4l were averaged according to the assumption in Equation 2.The resulting visualisation of the similarity measure over spatial (x and y combined) and rotational deviations from the correct camera pose for the CMC measure exhibits a unique peak at approximately zero deviation. Of course, due to a much smaller number of data samples compared to the simulations using synthetic data, the shape of the curve is much more bumpy, but this is in accordance with our expectation.The result of employing the GLM measure in this setting is shown in Figure 3. As it reveals a more distinctive peak compared to the curve for the CMC measure, it demonstrates the increased discrimination between more and less similar feature maps when taking the lengths of lines into account.Monte Carlo Localisation using the visual-sensor modelRecalling that our aim is to devise a probabilistic sensor model for a camera mounted on a mobile robot, we continue with presenting the results for an application to mobile robot localisation.The generic interface of the sensor model allows it to be used in the correction step of Bayesian localisation methods, for example, the standard version of the Monte Carlolocalisation (MCL) algorithm. Since statistical independence among sensor readings renders one of the underlying assumptions of MCL, our hope is to gain improved accuracy and robustness using the camera instead of or in addition to commonly used distance sensors like sonars or laser.Fig. 4: Image and projected models during localisation.In the experiment, the mobile robot equipped with a _xed-mounted CCD camera had to follow a pre-programmed route in the shape of a double loop in the corridor. On its way, it had to stop at eight pre-de_ned positions, turn to a nearby corner or open view, take an image, turn back and proceed. Each image capture initiated the so-called correction step of MCL and the weights of all samples were recomputed according to the visual-sensor model, yielding the highest density of samples at the potentially correct pose coordinates in the following resampling step. In the prediction step, the whole sample set is shifted in space according to the robot's motion model and the current odometry sensor readings.Our preliminary results look very promising. During the position tracking experiments, i.e., the robot was given an estimate of its starting position, the best hypothesis for the robot's pose was approximately at the correct pose most of the time. In this experiment, we have used the CMC measure. In Figure 4, a typical camera view is shown while the the robots follows the requested path. The grey-level image depicts the visual input for feature extraction after distortion removal andpre-processing. Also the extracted line features are displayed. Furthermore, the world model is projected according to two poses, the odometry-tracked pose and the estimate computed by MCL which approximately corresponds to the correct pose, between which we observe translational and rotational error.The picture also shows that rotational error has a strong inuence on the degree ofcoincidental feature pairs. This effect corresponds to the results presented above, where the figures exhibit a much higher gradient along the axis of rotational deviation than along that of translational deviation. The finding can be explained by the effect of motion on features in the Hough space. Hence, the strength of our camera sensor model lays at detecting rotational disagreement. This property makes it especially suitable for two-wheel driven robots like our Pioneer bearing a much higher rotational odometry error than translational error.5 Conclusions and Future WorkWe have presented a probabilistic visual-sensor model for camera-pose estimation. Its generic design makes it suitable for sensor fusion with distance measurements perceived from other sensors. We have shown extensive simulations under ideal and realistic conditions and identified appropriate similarity measures. The application of the sensor model in a localisation task for a mobile robot met our anticipations. Within the paper we highlighted much scope for improvements.We are working on suitable techniques to quantitatively evaluate the performanceof the devised sensor model in a localisation algorithm for mobile robots. This will enable us to experiment with cluttered environments and dynamic objects. Combining the camera sensor model with distance sensor information using sensor fusion renders the next step toward robust navigation. Because the number of useful features varies significantly as the robots traverses an indoor environment, the idea to steer the camera toward richer views (active vision) offers a promising research path to robust navigation.References[1] F. Dellaert, W. Burgard, D. Fox, and S. Thrun. Using the condensationalgorithm for robust, vision-based mobile robot localisation. In Proc. of the IEEE Conf. on Computer Vision and Pattern Recognition, 1999.[2] D. DeMenthon, P. David, and H. Samet. SoftPOSIT: An algorithm forregistration of 3D models to noisy perspective images combining Softassign and POSIT. Technical report, University of Maryland, MD, 2001.[3] G. N. DeSouza and A. C. Kak. Vision for mobile robot navigation: A survey.IEEE Trans. on Pattern Analysis and Machine Intelligence, 24(2):237{267,2002.[4] S. Enderle, M. Ritter, D. Fox, S. Sablatn•og, G. Kraetzschmar, and G. Palm.Soccer-robot localisation using sporadic visual features. In IntelligentAutonomous Systems 6, pages 959{966. IOS, 2000.[5] M. Fichtner. A camera sensor model for sensor fusion. Master's thesis, Dept. ofComputer Science, TU Dresden, Germany, 2002.[6] S. A. Hutchinson, G. D. Hager, and P. I. Corke. A tutorial on visual servocontrol. IEEE Trans. on Robotics and Automation, 12(5):651{ 670, 1996.[7] D. P. Huttenlocher, G. A. Klanderman, and W. J. Rucklidge. Comparing imagesusing the Hausdor_ distance. IEEE Trans. on Pattern Analysis and MachineIntelligence, 15(9):850{863, 1993.[8] A. Kosaka and A. C. Kak. Fast vision-guided mobile robot navigation usingmodel-based reasoning and prediction of uncertainties. Com- puter Vision,Graphics, and Image Processing { Image Understanding, 56(3):271{329, 1992. [9] R. Kumar and A. R. Hanson. Robust methods for estimating pose and asensitivity analysis. Computer Vision, Graphics, and Image Processing { Image Understanding, 60(3):313{342, 1994.[10] D. G. Lowe. Three-dimensional object recognition from single twodimensionalimages. Arti_cial Intelligence, 31(3):355{395, 1987.[11] S. Se, D. G. Lowe, and J. Little. Vision-based mobile robot localization andmapping using scale-invariant features. In Proc. of the IEEE Int. Conf. onRobotics and Automation, pages 2051{2058, 2001.[12] G. D. Sullivan, L. Du, and K. D. Baker. Quantitative analysis of the viewpointconsistency constraint in model-based vision. In Proc. of the 4th Int. IEEE Conf.on Computer Vision, pages 632{639, 1993.13一个有关移动机器人定位的视觉传感器模型Matthias Fichtner Axel Gro_mannArti_cial Intelligence InstituteDepartment of Computer ScienceTechnische Universit•at DresdenTechnical Report WV-03-03/CL-2003-02摘要我们提出一个在走廊和传感器模型凌乱的奥西环境下的概率估计。

传感器的基础知识中英文对照外文翻译文献

传感器的基础知识中英文对照外文翻译文献

中英文对照外翻译Basic knowledge of transducersA transducer is a device which converts the quantity being measured into an optical, mechanical, or-more commonly-electrical signal. The energy-conversion process that takes place is referred to as transduction.Transducers are classified according to the transduction principle involved and the form of the measured. Thus a resistance transducer for measuring displacement is classified as a resistance displacement transducer. Other classification examples are pressure bellows, force diaphragm, pressure flapper-nozzle, and so on.1、Transducer ElementsAlthough there are exception ,most transducers consist of a sensing element and a conversion or control element. For example, diaphragms,bellows,strain tubes and rings, bourdon tubes, and cantilevers are sensing elements which respond to changes in pressure or force and convert these physical quantities into a displacement. This displacement may then be used to change an electrical parameter such as voltage, resistance, capacitance, or inductance. Such combination of mechanical and electrical elements form electromechanical transducing devices or transducers. Similar combination can be made for other energy input such as thermal. Photo, magnetic and chemical,giving thermoelectric, photoelectric,electromaanetic, and electrochemical transducers respectively.2、Transducer SensitivityThe relationship between the measured and the transducer output signal is usually obtained by calibration tests and is referred to as the transducer sensitivity K1= output-signal increment / measured increment . In practice, the transducer sensitivity is usually known, and, by measuring the output signal, the input quantity is determined from input= output-signal increment / K1.3、Characteristics of an Ideal TransducerThe high transducer should exhibit the following characteristicsa) high fidelity-the transducer output waveform shape be a faithful reproduction of the measured; there should be minimum distortion.b) There should be minimum interference with the quantity being measured; the presence of the transducer should not alter the measured in any way.c) Size. The transducer must be capable of being placed exactly where it is needed.d) There should be a linear relationship between the measured and the transducer signal.e) The transducer should have minimum sensitivity to external effects, pressure transducers,for example,are often subjected to external effects such vibration and temperature.f) The natural frequency of the transducer should be well separated from the frequency and harmonics of the measurand.4、Electrical TransducersElectrical transducers exhibit many of the ideal characteristics. In addition they offer high sensitivity as well as promoting the possible of remote indication or mesdurement. Electrical transducers can be divided into two distinct groups:a) variable-control-parameter types,which include:i)resistanceii) capacitanceiii) inductanceiv) mutual-inductance typesThese transducers all rely on external excitation voltage for their operation.b) self-generating types,which includei) electromagneticii)thermoelectriciii)photoemissiveiv)piezo-electric typesThese all themselves produce an output voltage in response to the measurand input and their effects are reversible. For example, a piezo-electric transducer normally produces an output voltage in response to the deformation of a crystalline material; however, if an alternating voltage is applied across the material, the transducer exhibits the reversible effect by deforming or vibrating at the frequency of the alternating voltage.5、Resistance TransducersResistance transducers may be divided into two groups, as follows:i) Those which experience a large resistance change, measured by using potential-divider methods. Potentiometers are in this group.ii)Those which experience a small resistance change, measured by bridge-circuit methods. Examples of this group include strain gauges and resistance thermometers.5.1 PotentiometersA linear wire-wound potentiometer consists of a number of turns resistance wire wound around a non-conducting former, together with a wiping contact which travels over the barwires. The construction principles are shown in figure which indicate that the wiperdisplacement can be rotary, translational, or a combination of both to give a helical-type motion. The excitation voltage may be either a.c. or d.c. and the output voltage is proportional to the input motion, provided the measuring device has a resistance which is much greater than the potentiometer resistance.Such potentiometers suffer from the linked problem of resolution and electrical noise. Resolution is defined as the smallest detectable change in input and is dependent on thecross-sectional area of the windings and the area of the sliding contact. The output voltage is thus a serials of steps as the contact moves from one wire to next.Electrical noise may be generated by variation in contact resistance, by mechanical wear due to contact friction, and by contact vibration transmitted from the sensing element. In addition, the motion being measured may experience significant mechanical loading by the inertia and friction of the moving parts of the potentiometer. The wear on the contacting surface limits the life of a potentiometer to a finite number of full strokes or rotations usually referred to in the manufacture’s specification as the ‘number of cycles of life expectancy’, a typical value being 20*1000000 cycles.The output voltage V0 of the unload potentiometer circuit is determined as follows. Let resistance R1= xi/xt *Rt where xi = input displacement, xt= maximum possible displacement, Rt total resistance of the potentiometer. Then output voltage V0= V*R1/(R1+( Rt-R1))=V*R1/Rt=V*xi/xt*Rt/Rt=V*xi/xt. This shows that there is a straight-line relationship between output voltage and input displacement for the unloaded potentiometer.It would seen that high sensitivity could be achieved simply by increasing the excitation voltage V. however, the maximum value of V is determined by the maximum power dissipation P of the fine wires of the potentiometer winding and is given by V=(PRt)1/2 .5.2 Resistance Strain GaugesResistance strain gauges are transducers which exhibit a change in electrical resistance in response to mechanical strain. They may be of the bonded or unbonded variety .a) bonded strain gaugesUsing an adhesive, these gauges are bonded, or cemented, directly on to the surface of the body or structure which is being examined.Examples of bonded gauges arei) fine wire gauges cemented to paper backingii) photo-etched grids of conducting foil on an epoxy-resin backingiii)a single semiconductor filament mounted on an epoxy-resin backing with copper or nickel leads.Resistance gauges can be made up as single elements to measuring strain in one direction only,or a combination of elements such as rosettes will permit simultaneous measurements in more than one direction.b) unbonded strain gaugesA typical unbonded-strain-gauge arrangement shows fine resistance wires stretched around supports in such a way that the deflection of the cantilever spring system changes the tension in the wires and thus alters the resistance of wire. Such an arrangement may be found in commercially available force, load, or pressure transducers.5.3 Resistance Temperature TransducersThe materials for these can be divided into two main groups:a) metals such as platinum, copper, tungsten, and nickel which exhibit and increase in resistance as the temperature rises; they have a positive temperature coefficient of resistance.b) semiconductors, such as thermistors which use oxides of manganese, cobalt, chromium, or nickel. These exhibit large non-linear resistance changes with temperature variation and normally have a negative temperature coefficient of resistance.a) metal resistance temperature transducersThese depend, for many practical purpose and within a narrow temperature range, upon the relationship R1=R0*[1+a*(b1-b2)] where a coefficient of resistance in ℃-1,and R0 resistance in ohms at the reference temperature b0=0℃ at the reference temperature range ℃.The international practical temperature scale is based on the platinum resistance thermometer, which covers the temperature range -259.35℃ to 630.5℃.b) thermistor resistance temperature transducersThermistors are temperature-sensitive resistors which exhibit large non-liner resistance changes with temperature variation. In general, they have a negative temperature coefficient. For small temperature increments the variation in resistance is reasonably linear; but, if large temperature changes are experienced, special linearizing techniques are used in the measuring circuits to produce a linear relationship of resistance against temperature.Thermistors are normally made in the form of semiconductor discs enclosed in glass vitreous enamel. Since they can be made as small as 1mm,quite rapid response times are possible.5.4 Photoconductive CellsThe photoconductive cell , uses a light-sensitive semiconductor material. The resistance between the metal electrodes decrease as the intensity of the light striking the semiconductor increases. Common semiconductor materials used for photo-conductive cells are cadmium sulphide, lead sulphide, and copper-doped germanium.The useful range of frequencies is determined by material used. Cadmium sulphide is mainly suitable for visible light, whereas lead sulphide has its peak response in the infra-red regionand is, therefore , most suitable for flame-failure detection and temperature measurement. 5.5 Photoemissive CellsWhen light strikes the cathode of the photoemissive cell are given sufficient energy to arrive the cathode. The positive anode attracts these electrons, producing a current which flows through resistor R and resulting in an output voltage V.Photoelectrically generated voltage V=Ip.RlWhere Ip=photoelectric current(A),and photoelectric current Ip=Kt.BWhere Kt=sensitivity (A/im),and B=illumination input (lumen)Although the output voltage does give a good indication of the magnitude of illumination, the cells are more often used for counting or control purpose, where the light striking the cathode can be interrupted.6、Capacitive TransducersThe capacitance can thus made to vary by changing either the relative permittivity, the effective area, or the distance separating the plates. The characteristic curves indicate that variations of area and relative permittivity give a linear relationship only over a small range of spacings. Thus the sensitivity is high for small values of d. Unlike the potentionmeter, the variable-distance capacitive transducer has an infinite resolution making it most suitable for measuring small increments of displacement or quantities which may be changed to produce a displacement.7、Inductive TransducersThe inductance can thus be made to vary by changing the reluctance of the inductive circuit. Measuring techniques used with capacitive and inductive transducers:a)A.C. excited bridges using differential capacitors inductors.b)A.C. potentiometer circuits for dynamic measurements.c) D.C. circuits to give a voltage proportional to velocity for a capacitor.d) Frequency-modulation methods, where the change of C or L varies the frequency of an oscillation circuit.Important features of capacitive and inductive transducers are as follows:i)resolution infiniteii) accuracy+- 0.1% of full scale is quotediii)displacement ranges 25*10-6 m to 10-3miv) rise time less than 50us possibleTypical measurands are displacement, pressure, vibration, sound, and liquid level.8、Linear Variable-differential Ttransformer9、Piezo-electric Transducers10、Electromagnetic Transducers11、Thermoelectric Transducers12、Photoelectric Cells13、Mechanical Transducers and Sensing Elements传感器的基础知识传感器是一种把被测量转换为光的、机械的或者更平常的电信号的装置。

机器人外文翻译(中英文翻译)

机器人外文翻译(中英文翻译)

机器人外文翻译(中英文翻译)机器人外文翻译(中英文翻译)With the rapid development of technology, the use of robots has become increasingly prevalent in various industries. Robots are now commonly employed to perform tasks that are dangerous, repetitive, or require a high level of precision. However, in order for robots to effectively communicate with humans and fulfill their intended functions, accurate translation between different languages is crucial. In this article, we will explore the importance of machine translation in enabling robots to perform translation tasks, as well as discuss current advancements and challenges in this field.1. IntroductionMachine translation refers to the use of computer algorithms to automatically translate text or speech from one language to another. The ultimate goal of machine translation is to produce translations that are as accurate and natural as those generated by human translators. In the context of robots, machine translation plays a vital role in allowing them to understand and respond to human commands, as well as facilitating communication between robots of different origins.2. Advancements in Machine TranslationThe field of machine translation has experienced significant advancements in recent years, thanks to breakthroughs in artificial intelligence and deep learning. These advancements have led to the development of neural machine translation (NMT) systems, which have greatly improved translation quality. NMT models operate by analyzinglarge amounts of bilingual data, allowing them to learn the syntactic and semantic structures of different languages. As a result, NMT systems are capable of providing more accurate translations compared to traditional rule-based or statistical machine translation approaches.3. Challenges in Machine Translation for RobotsAlthough the advancements in machine translation have greatly improved translation quality, there are still challenges that need to be addressed when applying machine translation to robots. One prominent challenge is the variability of language use, including slang, idioms, and cultural references. These nuances can pose difficulties for machine translation systems, as they often require a deep understanding of the context and cultural background. Researchers are currently working on developing techniques to enhance the ability of machine translation systems to handle such linguistic variations.Another challenge is the real-time requirement of translation in a robotic setting. Robots often need to process and translate information on the fly, and any delay in translation can affect the overall performance and efficiency of the robot. Optimizing translation speed without sacrificing translation quality is an ongoing challenge for researchers in the field.4. Applications of Robot TranslationThe ability for robots to translate languages opens up a wide range of applications in various industries. One application is in the field of customer service, where robots can assist customers in multiple languages, providing support and information. Another application is in healthcare settings, where robots can act as interpreters between healthcare professionals and patientswho may speak different languages. Moreover, in international business and diplomacy, robots equipped with translation capabilities can bridge language barriers and facilitate effective communication between parties.5. ConclusionIn conclusion, machine translation plays a crucial role in enabling robots to effectively communicate with humans and fulfill their intended functions. The advancements in neural machine translation have greatly improved translation quality, but challenges such as language variability and real-time translation requirements still exist. With continuous research and innovation, the future of machine translation for robots holds great potential in various industries, revolutionizing the way we communicate and interact with technology.。

外文翻译--机器人技术简介

外文翻译--机器人技术简介

Introduction to robotics technologyIn the manufacturing field, robot development has focused on engineering robotic arms that perform manufacturing processes. In the space industry, robotics focuses on highly specialized, one-of-kind planetary rovers. Unlike a highly automated manufacturing plant, a planetary rover operating on the dark side of the moon -- without radio communication -- might run into unexpected situations. At a minimum, a planetary rover must have some source of sensory input, some way of interpreting that input, and a way of modifying its actions to respond to a changing world. Furthermore, the need to sense and adapt to a partially unknown environment requires intelligence (in other words, artificial intelligence).Mechanical platforms -- the hardware baseA robot consists of two main parts: the robot body and some form of artificial intelligence (AI) system. Many different body parts can be called a robot. Articulated arms are used in welding and painting; gantry and conveyor systems move parts in factories; and giant robotic machines move earth deep inside mines. One of the most interesting aspects of robots in general is their behavior, which requires a form of intelligence. The simplest behavior of a robot is locomotion. Typically, wheels are used as the underlying mechanism to make a robot move from one point to the next. And some force such as electricity is required to make the wheels turn under command.MotorsA variety of electric motors provide power to robots, allowing them to move material, parts, tools, or specialized devices with variousprogrammed motions. The efficiency rating of a motor describes how much of the electricity consumed is converted to mechanical energy. Let's take a look at some of the mechanical devices that are currently being used in modern robotics technology.Driving mechanismsGears and chains:Gears and chains are mechanical platforms that provide a strong and accurate way to transmit rotary motion from one place to another, possibly changing it along the way. The speed change between two gears depends upon the number of teeth on each gear. When a powered gear goes through a full rotation, it pulls the chain by the number of teeth on that gear.Pulleys and belts:Pulleys and belts, two other types of mechanical platforms used in robots, work the same way as gears and chains. Pulleys are wheels with a groove around the edge, and belts are the rubber loops that fit in that groove.Gearboxes:A gearbox operates on the same principles as the gear and chain, without the chain. Gearboxes require closer tolerances, since instead of using a large loose chain to transfer force and adjust for misalignments, the gears mesh directly with each other. Examples of gearboxes can be found on the transmission in a car, the timing mechanism in a grandfather clock, and the paper-feed of your printer.Power suppliesPower supplies are generally provided by two types of battery. Primary batteries are used once and then discarded; secondary batteries operate from a (mostly) reversible chemical reaction and can be recharged several times. Primary batteries have higher density and a lower self-dischargerate. Secondary (rechargeable) batteries have less energy than primary batteries, but can be recharged up to a thousand times depending on their chemistry and environment. Typically the first use of a rechargeable battery gives 4 hours of continuous operation in an application or robot.SensorsRobots react according to a basic temporal measurement, requiring different kinds of sensors.In most systems a sense of time is built-in through the circuits and programming. For this to be productive in practice, a robot has to have perceptual hardware and software, which updates quickly. Regardless of sensor hardware or software, sensing and sensors can be thought of as interacting with external events (in other words, the outside world). The sensor measures some attribute of the world. The term transducer is often used interchangeably with sensor. A transducer is the mechanism, or element, of the sensor that transforms the energy associated with what is being measured into another form of energy. A sensor receives energy and transmits a signal to a display or computer. Sensors use transducers to change the input signal (sound, light, pressure, temperature, etc.) into an analog or digital form capable of being used by a robot.Microcontroller systemsMicrocontrollers (MCUs) are intelligent electronic devices used inside robots. They deliver functions similar to those performed by a microprocessor (central processing unit, or CPU) inside a personal computer. MCUs are slower and can address less memory than CPUs, but are designed for real-world control problems. One of the major differences between CPUs and MCUs is the number of external components needed tooperate them. MCUs can often run with zero external parts, and typically need only an external crystal or oscillator.Utilities and toolsROBOOP (A robotics object oriented package in C++):This package is an object-oriented toolbox in C++ for robotics simulation. Technical references and downloads are provided in the Resources.CORBA: A real-time communications and object request broker software package for embedding distributed software agents. Each independent piece of software registers itself and its capabilities to the ORB, by means of an IDL (Interface Definition Language). Visit their Web site (see Resources) for technical information, downloads, and documentation for CORBA.TANGO/TACO:This software might be useful for controlling a robotics system with multiple devices and tools. TANGO is an object oriented control system based on CORBA. Device servers can be written in C++ or Java. TACO is object oriented because it treats all(physical and logical) control points in a control system as objects in a distributed environment. All actions are implemented in classes. New classes can be constructed out of existing classes in a hierarchical manner, thereby ensuring a high level of software reuse. Classes can be written in C++, in C (using a methodology called Objects in C), in Python or in LabView (using the G programming language).ControllersTask Control Architecture: The Task Control Architecture (TCA) simplifies building task-level control systems for mobile robots. "Task-level" refers to the integration and coordination of perception, planning, andreal time control to achieve a given set of goals (tasks). TCA provides a general control framework, and is intended to control a wide variety of robots. TCA provides a high-level machine-independent method for passing messages between distributed machines (including between Lisp and C processes). TCA provides control functions, such as task decomposition, monitoring, and resource management, that are common to many mobile robot applications. The Resources section provides technical references and download information for Task Control Architecture.EMC (Enhanced Machine Controller): The EMC software is based on the NIST Real time Control System (RCS) methodology, and is programmed using the NIST RCS Library. The RCS Library eases the porting of controller code to a variety of UNIX and Microsoft platforms, providing a neutral application programming interface (API) to operating system resources such as shared memory, semaphores and timers. The EMC software is written in C and C++, and has been ported to the PC Linux, Windows NT, and Sun Solaris operating systems.Darwin2K: Darwin2K is a free, open source toolkit for robot simulation and automated design. It features numerous simulation capabilities and an evolutionary algorithm capable of automatically synthesizing and optimizing robot designs to meet task-specific performance objectives.LanguagesRoboML (Robotic Markup Language): RoboML is used for standardized representation of robotics-related data. It is designed to support communication language between human-robot interface agents, as well as between robot-hosted processes and between interface processes, and to provide a format for archived data used by human-robot interface agents.ROSSUM: A programming and simulation environment for mobile robots. The Rossum Project is an attempt to help collect, develop, and distribute software for robotics applications. The Rossum Project hopes to extend the same kind of collaboration to the development of robotic software.XRCL (Extensible Robot Control Language): XRCL (pronounced zircle) is a relatively simple, modern language and environment designed to allow robotics researchers to share ideas by sharing code. It is an open source project, protected by the GNU Copyleft.SummaryThe field of robotics has created a large class of robots with basic physical and navigational competencies. At the same time, society has begun to move towards incorporating robots into everyday life, from entertainment to health care. Moreover, robots could free a large number of people from hazardous situations, essentially allowing them to be used as replacements for human beings. Many of the applications being pursued by AI robotics researchers are already fulfilling that potential. In addition, robots can be used for more commonplace tasks such as janitorial work. Whereas robots were initially developed for dirty, dull, and dangerous applications, they are now being considered as personal assistants. Regardless of application, robots will require more rather than less intelligence, and will thereby have a significant impact on our society in the future as technology expands to new horizons.外文出处:Robotic technology / edited by A. Pugh./P. Peregrinus, c1993.附件1:外文资料翻译译文机器人技术简介在制造业领域,机器人的开发集中在执行制造过程的工程机器人手臂上。

工业机器人的介绍外文文献翻译、中英文翻译、外文翻译

工业机器人的介绍外文文献翻译、中英文翻译、外文翻译

外文原文Introduction to Industrial RobotsIndustrial robets became a reality in the early 1960’s when Joseph Engelberger and George Devol teamed up to form a robotics company they called “Unimation”.Engelberger and Devol were not the first to dream of machines that could perform the unskilled, repetitive jobs in manufacturing. The first use of the word “robots” was by the Czechoslovakian philosopher and playwright Karel Capek in his play R.U.R.(Rossum’s Universal Robot). The word “robot” in Czech means “worker” or “slave.” The play was written in 1922.In Capek’s play , Rossum and his son discover the chemical formula for artificial protoplasm. Protoplasm forms the very basis of life.With their compound,Rossum and his son set out to make a robot.Rossum and his son spend 20 years forming the protoplasm into a robot. After 20 years the Rossums look at what they have created and say, “It’s absurd to spend twenty years making a man if we can’t make him quicker than nature, you might as w ell shut up shop.”The young Rossum goes back to work eliminating organs he considers unnecessary for the ideal worker. The young Rossum says, “A man is something that feels happy , plays piano ,likes going for a walk, and in fact wants to do a whole lot of things that are unnecessary … but a working machine must not play piano, must not feel happy, must not do a whole lot of other things. Everything that doesn’t contribute directly to the progress of work should be eliminated.”A half century later, engi neers began building Rossum’s robot, not out of artificial protoplasm, but of silicon, hydraulics, pneumatics, and electric motors. Robots that were dreamed of by Capek in 1922, that work but do not feel, that perform unhuman or subhuman, jobs in manufacturing plants, are available and are in operation around the world.The modern robot lacks feeling and emotions just as Rossum’s son thought it should. It can only respond to simple “yes/no” questions. The moderrn robot is normally bolted to the floor. It has one arm and one hand. It is deaf, blind, and dumb. In spite of all of these handicaps, the modern robot performs its assigned task hour after hour without boredom or complaint.A robot is not simply another automated machine. Automation began during the industrial revolution with machines that performed jobs that formerly had been done by human workers. Such a machine, however , can do only the specific job for which it was designed, whereas a robot can perform a variety of jobs.A robot must have an arm. The arm must be able to duplicate the movements of a human worker in loading and unloading other automated machines, spraying paint, welding, and performing hundreds of other jobs that cannot be easily done with conventional automated machines.DEFINITION OF A ROBOTThe Robot Industries Association(RIA) has published a definition for robots in an attempt to clarify which machines are simply automated machines and which machines are truly robots. The RIA definition is as follows:“A robot is a reprogrammabl e multifunctional manipulator designed to move material, parts, tools, or specialized devices through variable programmed motions for the performance of a variety of tasks.”This definition, which is more extensive than the one in the RIA glossary at the end of this book, is an excellent definition of a robot. We will look at this definition, one phrase at a time, so as to understand which machines are in fact robots and which machines are little more than specialized automation.First, a robot is a “reprogrammable multifunctional manipulator.” In this phrase RIA tells us that a robot can be taught (“reprogrammed”) to do more than one job by changing the informaion stored in its memory. A robot can be reprogrammed to load and unload machines, weld, and do ma ny other jobs (“multifunctional”). A robot is a“manipulator”. A manipulator is an arm( or hand ) that can pick up or move things. At this point we know that a robot is an arm that can be taught to do different jobs.The definition goes on to say that a ro bot is “designed to move material, parts, tools, or specialized devices.” Material includes wood,steel, plastic, cardboard… anything that is used in the manufacture of a product.A robot can also handle parts that have been manufactured. For example, a robot can load a piece of steel into an automatic lathe and unload a finished part out of the lathe.In addition to handling material and parts, a robot can be fitted with tools such as grinders, buffers, screwdrivers, and welding torches to perform useful work.Robots can also be fitted with specialized instruments or devices to do special jobs in a manufacturing plant. Robots can be fitted with television cameras for inspection of parts or products. They can be fitted with lasers to accurately mearure the size of parts being manufactured.The RIA definition closes with the phrase,”…through variable programmed motions for the performance of a variety of tasks.” This phrase emphasizes the fact that a robot can do many different jobs in a manufacturing plant. The variety of jobs that a robot can do is limited only by the creativity of the application engineer.JOBS FOR ROBOTSJobs performed by robots can be divided into two major categories:hazardous jobs and repetitive jobs.Hazardous JobsMany applications of robots are in jobs that are hazardous to humans. Such jobs may be considered hazardous because of toxic fumes, the weight of the material being handled, the temperature of the material being handled, the danger of working near rotating or press machinery, or environments containing high levels of radiation. Repetitive JobsIn addition to taking over hazardous jobs, robots are well suited to doingextremely repetitive jobs that must be done in manufacturing plants.many jobs in manufacturing plants require a person to act more like a machine than like a human. The job may be to pick a piece up from here and place it there. The same job is done hundreds of times each day. The job requires little or no judgment and little or no skill. This is not said as a criticism of the person who does the job , but is intended simply to point out that many of these jobs exist in industry and must be done to complete the manufacture of products. A robot can be placed at such a work station and can perform the job admirably without complaining or experiencing the fatigue and boredom normally associated with such a job.Although robots eliminate some jobs in industry, they normally eliminate jobs that humans should never have been asked to do. Machines should perform as machines doing machine jobs, and humans should be placed in jobs that require the use of their ability,creativity, and special skills.POTENTIAL FOR INCREASED PRODUCTIVITYIn addition to removing people from jobs they should not have been placed in, robots offer companies the opportunity of achieving increased productivity. When robots are placed in repetitive jobs they continue to operate at their programmed pace without fatigue. Robots do not take either scheduled or unscheduled breaks from the job. The increase in productivity can result in at least 25% more good parts being produced in an eight-hour shift. This increase in productivity increases the company's profits, which can be reinvested in additional plants and equipment. This increase in productivity results in more jobs in other departments in the plant. With more parts being produced, additional people are needed to deliver the raw materials to the plant, to complete the assembly of the finished products, to sell the finished products, and to deliver the products to their destinations.ROBOT SPEEDAlthough robots increase productivity in a manufacturing plant, they are notexceptionally fast. At present, robots normally operate at or near the speed of a human operator. Every major move of a robot normally takes approximately one second. For a robot to pick up a piece of steel from a conveyor and load it into a lathe may require ten different moves taking as much as ten seconds. A human operator can do the same job in the same amount of time . The increase in productivity is a result of the consistency of operation. As the human operator repeats the same job over and over during the workday, he or she begins to slow down. The robot continues to operate at its programmed speed and therefore completes more parts during the workday.Custom-built automated machines can be built to do the same jobs that robots do. An automated machine can do the same loading operation in less than half the time required by a robot or a human operator. The problem with designing a special machine is that such a machine can perform only the specific job for which it was built. If any change is made in the job, the machine must be completely rebuilt, or the machine must be scrapped and a new machine designed and built. A robot, on the other hand, could be reprogrammed and could start doing the new job the same day.Custom-built automated machines still have their place in industry. If a company knows that a job will not change for many years, the faster custom-built machine is still a good choice.Other jobs in factories cannot be done easily with custom-built machinery. For these applications a robot may be a good choice. An example of such an application is spray painting. One company made cabinets for the electronics industry. They made cabinets of many different sizes, all of which needed painting. It was determined that it was not economical for the company to build special spray painting machines for each of the different sizes of enclosures that were being built. Until robots were developed, the company had no choice but to spray the various enclosures by hand.Spray painting is a hazardous job , because the fumes from many paints are both toxic and explosive. A robot is now doing the job of spraying paint on the enclosures.A robot has been “taught” to spray all the different sizes of enclosures that the company builds. In addition, the robot can operate in the toxic environment of the spray booth without any concern for the long-term effect the fumes might have on aperson working in the booth.FLEXIBLE AUTOMATIONRobots have another advantage: they can be taught to do different jobs in the manufacturing plant. If a robot was originally purchased to load and unload a punch press and the job is no longer needed due to a change in product design, the robot can be moved to another job in the plant. For example, the robot could be moved to the end of the assembly operation and be used to unload the finished enclosures from a conveyor and load them onto a pallet for shipment.ACCURACY AND REPEATABILITYOne very important characteristic of any robot is the accuracy with which it can perform its task. When the robot is programmed to perform a specific task, it is led to specific points and programmed to remember the locations of those points. After programming has been completed, the robot is switched to “run” and the program is executed. Unfortunately, the robot will not go to the exact location of any programmed point. For example, the robot may miss the exact point by 0.025 in. If 0.025 in. is the greatest error by which the robot misses any point- during the first execution of the program, the robot is said to have an accuracy of 0.025 in.In addition to accuracy , we are also concerned with the robot’s repeatability. The repeatability of a robot is a measure of how closely it returns to its programmed points every time the program is executed. Say , for example, that the robot misses a programmed point by 0.025 in. the first time the program is executed and that, during the next execution of the program, the robot misses the point it reached during the previous cycle by 0.010 in. Although the robot is a total of 0.035 in. from the original programmed point, its accuracy is 0.025 in. and its repeatability is 0.010 in.THE MAJOR PARTS OF A ROBOTThe major parts of a robot are the manipulator, the power supply, and the controller.The manipulator is used to pick up material, parts, or special tools used in manufacturing. The power supply suppplies the power to move the manipulator. The controller controls the power supply so that the manipulator can be taught to perform its task.外文翻译工业机器人的介绍20世纪60年代当约瑟夫和乔治合作创立了名为Unimation的机器公司,工业机器人便成为了一个事实。

空间机器人中英文对照外文翻译文献

空间机器人中英文对照外文翻译文献

中英文翻译(文档含英文原文和中文翻译)外文文献:Space Robot Path Planningfor Collision AvoidanceAbstract — This paper deals with a path planning of space robot which includes a collision avoidance algorithm. For the future space robot operation, autonomous and self-contained path planning is mandatory to capture a target without the aid of ground station. Especially the collision avoidance with target itself must be always considered. Once the location, shape and grasp point of the target are identified, those will be expressed in the configuration space. And in this paper a potential method.Laplace potential function is applied to obtain the path in the configuration space in order to avoid so-called deadlock phenomenon. Improvement on the generation of the path has been observed by applying path smoothing method, which utilizes the spline function interpolation. This reduces the computational load and generates the smooth path of the space robot. The validity of this approach is shown by a few numerical simulations.Key Words—Space Robot, Path Planning, Collision Avoidance, Potential Function, Spline InterpolationI. INTRODUCTIONIn the future space development, the space robot and its autonomy will be key features of thespace technology. The space robot will play roles to construct space structures and perform inspections and maintenance of spacecrafts. These operations are expected to be performed in an autonomous.In the above space robot operations, a basic and important task is to capture free flying targets on orbit by the robotic arm. For the safe capturing operation, it will be required to move the arm from initial posture to final posture without collisions with the target.The configuration space and artificial potential methods are often applied to the operation planning of the usual robot. This enables the robot arm to evade the obstacle and to move toward the target. Khatib proposed a motion planning method, in which between each link of the robot and the obstacle the repulsive potential is defined and between the end-effecter of the robot and the goal the attractive potential is defined and by summing both of the potentials and using the gradient of this potential field the path is generated. This method is advantageous by its simplicity and applicability for real-time operation. However there might be points at which the repulsive force and the attractive force are equal and this will lead to the so-called deadlock.In order to resolve the above issue, a few methods are proposed where the solution of Laplace equation is utilized. This method assures the potential fields without the local minimum, i.e., no deadlock. In this method by numerical computation Laplace equation will be solved and generates potential field. The potential field is divided into small cells and on each node the discrete value of the potential will be specified.In this paper for the elimination of the above defects, spline interpolation technique is proposed. The nodal point which is given as a point of path will be defined to be a part of smoothed spline function. And numerical simulations are conducted for the path planning of the space robot to capture the target, in which the potential by solving the Laplace equation is applied and generates the smooth and continuous path by the spline interpolation from the initial to the final posture.II. ROBOT MODELThe model of space robot is illustrated in Fig.1.The robot is mounted on a spacecraft and has two rotary joints which allow the in-plane motion of the end-effecter. In this case we have an additional freedom of the spacecraft attitude angle and this will be considered the additional rotary joint. This means that the space robot isthree linked with 3 DOF (Degree Of Freedom). The length of each link and the angle of each rotary joint are given by i l and i (i = 1,2,3) , respectively. In order to simplify the discussions a few assumptions are made in this paper:-the motion of the space robot is in-plane,i.e., two dimensional one.-effect of robot arm motion to the spacecraft attitude is negligible.-robot motion is given by the relation of static geometry and not explicitly depending on time. -the target satellite is inertially stabilized.In general in-plane motion and out-of-plane motion will be separately performed. So we are able to assume the above first one without loss of generality. The second assumption derives from the comparison of the ratio of mass between the robot arm and the spacecraft body. With respect to the third assumption we focus on generating the path planning of the robot and this is basically given by the static nature of geometry relationship and is therefore not depending on the time explicitly. The last one means the satellite is cooperative.Fig.1 Model of Two-link Space RobotIII. PATH PLANNING GALGORITHMA. Laplace Potential GuidanceThe solution of the Laplace equation (1) is called a Harmonic potential function, and its and minimum values take place only on the boundary. In the robot path generation the boundary means obstacle and goal. Therefore inside the region where the potential is defined, no local minimum takes place except the goal. This eliminates the deadlock phenomenon for path generation.2n22i 1i 0x =∂∅∅==∂∑∇ (1) The Laplace equation can be solved numerically. We define two dimensional Laplace equation as below:22220x y∂∅∂∅+=∂∂ (2) And this will be converted into the difference equation and then solved by Gauss -Seidel method. In equation (2) if we take the central difference formula for second derivatives, the following equation will be obtained:2222220x y(x x,y )2(x,y )(x x,y )x(x,y y )2(x,y )(x,y y )y ∆∆∂∅∂∅+=∂∂∅+∆-∅+∅-∆⇒∅+∆-∅+∅-∆+ (3) where x ∆,y ∆ are the step (cell) sizes between adjacent nodes for each x , y direction. If the step size is assumed equal and the following notation is used:i 1,j (x x,y )+∅+∆=∅Then equation (3) is expressed in the following manner:1,1,,1,1,0i j i j i j i j i j +-+-∅+∅+∅+∅-4∅= (4)And as a result, two dimensional Laplace equation will be converted into the equation (5) as below:i,j i 1,j i 1,j i,j 1i,j 114+-+-∅=(∅+∅+∅+∅) (5) In the same manner as in the three dimensional case, the difference equation for the three dimensional Laplace equation will be easily obtained by the following:i,j ,k i 1,j ,k i 1,j ,k i,j 1,k i,j 1,k i,j ,k 1i,j ,k 116+-+-+-∅=(∅+∅+∅+∅+∅+∅) (6) In order to solve the above equations we apply Gauss-Seidel method and have equations as follows:n 1n n 1n n 1i,j i 1,j i 1,j i,j 1i,j 114++++-+-∅=(∅+∅+∅+∅) (7)where 1,n i j +∅ is the computational result from the ( n +1 )-th iterative calculations of the potential.In the above computations, as the boundary conditions, a certain positive number 0∅ is defined for the obstacle and 0 for the goal. And as the initial conditions the same number 0∅ is also given for all of the free nodes. By this approach during iterative computations the value of the boundary nodes will not change and the values only for free nodes will be varying. Applying the same potential values as the obstacle and in accordance with the iterative computational process, the small potential around the goal will be gradually propagating like surrounding the obstacle. The potential field will be built based on the above procedure.Using the above potential field from 4 nodal points adjacent to the node on which the space robot exists, the smallest node is selected for the point to move to. This procedure finally leads the space robot to the goal without collision.B . Spline InterpolationThe path given by the above approach does not assure the smoothly connected one. And if the goal is not given on the nodal point, we have to partition the cells into much more smaller cells. This will increase the computational load and time.In order to eliminate the above drawbacks we propose the utilization of spline interpolation technique. By assigning the nodal points given by the solution to via points on the path, we try to obtain the smoothly connected path with accurate initial and final points.In this paper the cubic spline was applied by using MATLAB command.C. Configuration SpaceWhen we apply the Laplace potential, the path search is assured only in the case where the robot is expressed to be a point in the searching space. The configuration space(C-Space), where the robot is expressed as a point, is used for the path search. To convert the real space into the C-Space the calculation to judge the condition of collision is performed and if the collision exists, the corresponding point in the C-space is regarded as the obstacle. In this paper when the potential field was generated, the conditions of all the points in the real space, corresponding to all the nodes, were calculated. The judgment of intersection between a segment constituting the robot arm and a segment constituting the obstacle at each node was made and if the intersection takes place, this node is treated as the obstacle in the C-Space.IV.NUMERICAL SIMULATIONSBased on the above approach the path planning for capturing a target satellite was examined using a space robot model. In this paper we assume the space robot with two dimensional and 2 DOF robotic arm as shown in Fig.1.The length of each link is given as follows:l 1 =1.4[m ], l 2 = 2.0[m ], l 3 = 2.0[m ] ,and the target satellite was assumed 1m square. The grasp handle, 0.1 m square, was located at a center of one side of the target. So this handle is a goal of the path.Let us explain the geometrical relation between the space robot and the target satellite. When we consider the operation after capturing the target, it is desirable for the space robot to have the large manipulability. Therefore in this paper the end-effecter will reach the target when the manipulability is maximized. In the 3DOF case, not depending on the spacecraft body attitude, the manipulability is measured by 2,3θθ. And if we assume the end-effector of the space robot should be vertical to the target, then all of the joints angles are predetermined as follows:123160.7,32.8,76.5o o o θθθ===As all the joints angles are determined, the relative position between the spacecraft and the target is also decided uniquely. If the spacecraft is assumed to locate at the origin of the inertial frame (0, 0), the goal is given by (-3.27, -2.00) in the above case. Based on these preparations, we can search the path to the goal by moving the arm in the configuration space.Two simulations for path planning were carried out and the results are shown below.A. 2 DOF RobotIn order to simplify the situation, the attitude angle(Link 1 joint angle) is assumed to coincide with the desirable angle from the beginning. The coordinate system was assumed as shown in Fig.2.1θ was taken into consideration for the calculation of the initial condition of the Link 2 and its goal angles:Innitial condition:2364.3,90o θθο=-=Goal condition: 23166.5,76.5o θθο=-=In this case the potential field was computed for the C-Space with 180 segments. Fig.3 shows the C-Space and the hatched large portion in the center is given by the obstacle mapped by the spacecraft body. The left side portion is a mapping of the target satellite. Fig.4 shows a generated path and this was spline-interpolated curve by using alternate points of discrete data for smoothing .Fig.3 2 DOF C-SpaceFig.4 Path in C-Space(2 DOF)When we consider the rotation of spacecraft body, -180 degrees are equal to +180 degrees and, then, the state over -180 degrees will be started from +180 degrees and again back to the C-Space. For this reason the periodic boundary condition was applied in order to assure the continuity of the rotation. For the simplicity to look at the path, the mapped volume by the spacecraft body was omitted. Also for the simplicity of the path expression the chart which hasdirection was illustrated. From this figure it is easily the connection of -180 degrees in the1seen that over -180 degrees the path is going toward the goal C. B and C are the same goal point.V. CONCLUSIONIn this paper a path generation method for capturing a target satellite was proposed. And its applicability was demonstrated by numerical simulations. By using interpolation technique the computational load will be decreased and smoothed path will be available. Further research will be recommended to incorporate the attitude motion of the spacecraft body affected by arm motion.中文译文:空间机器人避碰路径规划摘要:本文论述的是空间机器人路径规划,这种规划主要运用的是避碰算法。

机器人技术发展趋势论文中英文对照资料外文翻译文献

机器人技术发展趋势论文中英文对照资料外文翻译文献

中英文对照资料外文翻译文献机器人技术发展趋势谈到机器人,现实仍落后于科幻小说。

但是,仅仅因为机器人在过去的几十年没有实现它们的承诺,并不意味着机器人的时代不会到来,或早或晚。

事实上,多种先进技术的影响已经使得机器人的时代变得更近——更小、更便宜、更实用和更具成本效益。

肌肉、骨骼和大脑任何一个机器人都有三方面:·肌肉——有效联系有关物理荷载以便于机器人运动。

·骨骼——一个机器人的物理结构取决于它所做的工作;它的尺寸大小和重量则取决于它的物理荷载。

·大脑——机器人智能;它能独立思考和做什么;需要多少人工互动。

由于机器人在科幻世界中所被描绘过的方式,很多人希望机器人在外型上与人类相似。

但事实上,机器人的外形更多地取决于它所做的工作或具备的功能。

很多一点儿也不像人的机器也被清楚地归为机器人。

同样,很多看起来像人的机器却还是仅仅属于机械结构和玩具。

很多早期的机器人是除了有很大力气而毫无其他功能的大型机器。

老式的液压动力机器人已经被用来执行3-D任务即平淡、肮脏和危险的任务。

由于第一产业技术的进步,完全彻底地改进了机器人的性能、业绩和战略利益。

比如,20世纪80年代,机器人开始从液压动力转换成为电动单位。

精度和性能也提高了。

工业机器人已经在工作时至今日,全世界机器人的数量已经接近100万,其中超过半数的机器人在日本,而仅仅只有15%在美国。

几十年前,90%的机器人是服务于汽车生产行业,通常用于做大量重复的工作。

现在,只有50%的机器人用于汽车制造业,而另一半分布于工厂、实验室、仓库、发电站、医院和其他的行业。

机器人用于产品装配、危险物品处理、油漆喷雾、抛光、产品的检验。

用于清洗下水道,探测炸弹和执行复杂手术的各种任务的机器人数量正在稳步增加,在未来几年内将继续增长。

机器人智能即使是原始的智力,机器人已经被证明了在生产力、效率和质量方面都能够创造良好的效益。

除此之外,一些“最聪明的”机器人没有用于制造业;它们被用于太空探险、外科手术遥控,甚至于宠物,比如索尼的AIBO电子狗。

关于现代工业机械手外文文献翻译@中英文翻译@外文翻译

关于现代工业机械手外文文献翻译@中英文翻译@外文翻译

附录About Modenr Industrial Manipulayor Robot is a type of mechantronics equipment which synthesizes the last research achievement of engine and precision engine, micro-electronics and computer, automation control and drive, sensor and message dispose and artificial intelligence and so on. With the development of economic and the demand for automation control, robot technology is developed quickly and all types of the robots products are come into being. The practicality use of robot not only solves the problems which are difficult to operate for human being, but also advances the industrial automation program. Modern industrial robots are true marvels of engineering. A robot the size of a person can easily carry a load over one hundred pounds and move it very quickly with a repeatability of 0.006inches. Furthermore these robots can do that 24hours a day for years on end with no failures whatsoever. Though they are reprogrammable, in many applications they are programmed once and then repeat that exact same task for years.At present, the research and development of robot involves several kinds of technology and the robot system configuration is so complex that the cost at large is high which to a certain extent limit the robot abroad use. To development economic practicality and high reliability robot system will be value to robot social application and economy development. With he rapid progress with the control economy and expanding of the modern cities, the let of sewage is increasing quickly; with the development of modern technology and the enhancement of consciousness about environment reserve, more and more people realizedthe importance and urgent of sewage disposal. Active bacteria method is an effective technique for sewage disposal. The abundance requirement for lacunaris plastic makes it is a consequent for plastic producing with automation and high productivity. Therefore, it is very necessary to design a manipulator that can automatically fulfill the plastic holding. With the analysis of the problems in the design of the plasticholding manipulator and synthesizing the robot research and development condition in recent years, a economic scheme is concluded on the basis of the analysis of mechanical configuration, transform system, drive device and control system and guided by the idea of the characteristic and complex of mechanical configuration, electronic, software and hardware. In this article, the mechanical configuration combines the character of direction coordinate which can improve the stability and operation flexibility of the system. The main function of the transmission mechanism is to transmit power to implement department and complete the necessary movement. In this transmission structure, the screw transmission mechanism transmits the rotary motion into linear motion. Worm gear can give vary transmission ratio. Both of the transmission mechanisms have a characteristic of compact structure. The design of drive system often is limited by the environment condition and the factor of cost and technical lever. The step motor can receive digital signal directly and has the ability to response outer environment immediately and has no accumulation error, which often is used in driving system. In this driving system, open-loop control system is composed of stepping motor, which can satisfy the demand not only for control precision but also for the target of economic and practicality. On this basis, the analysis of stepping motor in power calculating and style selecting is also given. The analysis of kinematics anddynamics for object holding manipulator is given in completing the design of mechanical structure and drive system.Current industrial approaches to robot arm control treat each joint of the robot arm as a simple joint servomechanism. The servomechanism approach models the varying dynamics of a manipulator inadequately because it neglects the motion and configuration of the whole arm mechanism. These changes in the parameters of the controlled system sometimes are significant enough to render conventional feedback control strategies ineffective. The result is reduced servo response speed and damping, limiting the precision and speed of the end-effecter and making it appropriate only for limited-precision tasks. Manipulators controlled in this manner move at slow speeds with unnecessary vibrations. Any significant performance gain in this and other areas of robot arm control require the consideration of more efficient dynamic models, sophisticated control approaches, and the use of dedicated computer architectures and parallel processing techniques.In the industrial production and other fields, people often endangered by such factors as high temperature, corrode, poisonous gas and so forth at work, which have increased labor intensity and even jeopardized the life sometimes. The corresponding problems are solved since the robot arm comes out. The arms can catch, put and carry objects, and its movements are flexible and diversified. It applies to medium and small-scale automated production in which production varieties can be switched. And it is widely used on soft automatic line. The robot arms are generally made by withstand high temperatures, resist corrosion of materials to adapt to the harsh environment. So they reduced the labor intensity of the workers significantly and raised work efficiency. The robot arm is an importantcomponent of industrial robot, and it can be called industrial robots on many occasions. Industrial robot is set machinery, electronics, control, computers, sensors, artificial intelligence and other advanced technologies in the integration of multidisciplinary important modern manufacturing equipment. Widely using industrial robots, not only can improve product quality and production, but also is of great significance for physical security protection, improvement of the environment for labor, reducing labor intensity, improvement of labor productivity, raw material consumption savings and lowering production costs.There are such mechanical components as ball footbridge, slides, air control mechanical hand and so on in the design. A programmable controller, a programming device, stepping motors, stepping motors drives, direct current motors, sensors, switch power supply, an electromagnetism valve and control desk are used in electrical connection.Robot is the automated production of a kind used in the process of crawling and movin g piece features automatic device, which is mechanized and automated production process d eveloped a new type of device. In recent years, as electronic technology, especially compute r extensive use of robot development and production of hightech fields has become a rapidl y developed a new technology, which further promoted the development of robot, allowing robot to better achieved with the combination of mechanization and automation. Robot can replace humans completed the risk of duplication of boring work, to reduce human labor int ensity and improve labor productivity. Manipulator has been applied more and more widely, in the machinery industry, it can be used for parts assembly, work piece handling, loading a nd unloading, particularly in the automation of CNC machine tools, modular machine toolsmore commonly used. At present, the robot has developed into a FMS flexible manufacturin g systems and flexible manufacturing cell in an important component of the FMC. The mac hine tool equipment and machinery in hand together constitute a flexible manufacturing syst em or a flexible manufacturing cell, it was adapted to small and medium volume production , you can save a huge amount of the work piece conveyor device, compact, and adaptable. When the work piece changes, flexible production system is very easy to change will help e nterprises to continuously update the marketable variety, improve product quality, and better adapt to market competition. At present, China's industrial robot technology and its enginee ring application level and comparable to foreign countries there is a certain distance, applica tion and industrialization of the size of the low level of robot research and development of a direct impact on raising the level of automation in China, from the economy, technical cons iderations are very necessary. Therefore, the study of mechanical hand design is very meani ngful.关于现代工业机械手机器人是典型的机电一体化装置,它综合运用了机械与精密机械、微电子与计算机、自动控制与驱动、传感器与信息处理以及人工智能等多学科的最新研究成果,随着经济技术的开展和各行各业对自动化程度要求的提高,机器人技术得到了迅速开展,出现了各种各样的机器人产品。

传感器技术外文文献及中文翻译

传感器技术外文文献及中文翻译

Sensor technologyA sensor is a device which produces a signal in response to its detecting or measuring a property ,such as position , force , torque ,pressure , temperature ,humidity , speed ,acceleration ,or vibration 。

Traditionally ,sensors (such as actuators and switches )have been used to set limits on the performance of machines .Common examples are (a)stops on machine tools to restrict work table movements ,(b) pressure and temperature gages with automatics shut-off features ,and (c)governors on engines to prevent excessive speed of operation . Sensor technology has become an important aspect of manufacturing processes and systems 。

It is essential for proper data acquisition and for the monitoring ,communication ,and computer control of machines and systems 。

Because they convert one quantity to another , sensors often are referred to as transducers .Analog sensors produce a signal , such as voltage ,which is proportional to the measured quantity .Digital sensors have numeric or digital outputs that can be transferred to computers directly 。

机器人英文翻译原文

机器人英文翻译原文

南京理工大学紫金学院毕业设计(论文)外文资料翻译系:机械工程系专业:机械工程与自动化姓名:学号:外文出处:Robot Journal of Manufacturing(用外文写)Science and Engineering附件:1.外文资料翻译译文;2.外文原文。

附件1:外文资料翻译译文附件2:外文原文RobotJournal of ManufacturingScience and EngineeringRobot is a type of mechantronics equipment which synthesizes the lastresearch achievement of engine and precision engine, micro-electronicsand computer, automation control and drive, sensor and message disposeand artificial intelligence and so on. With the development of economicand the demand for automation control, robot technology is developedquickly and all types of the robots products are come into being. Thepracticality use of robot products not only solves the problems which aredifficult to operate for human being, but also advances the industrialautomation program. Modern industrial robots are true marvels ofengineering. A robot the size of a person can easily carry a load overone hundred pounds and move it very quickly with a repeatability of+/-0.006 inches. Furthermore these robots can do that 24 hours a day foryears on end with no failures whatsoever. Though they are reprogrammable,in many applications (particularly those in the auto industry) they areprogrammed once and then repeat that exact same task for years.At present, the research and development of robot involves severalkinds of technology and the robot system configuration is so complex thatthe cost at large is high which to a certain extent limit the robot abroaduse. To development economic practicality and high reliability robotsystem will be value to robot social application and economy development.With the rapid progress with the control economy and expanding of themodern cities, the let of sewage is increasing quickly: With thedevelopment of modern technology and the enhancement of consciousnessabout environment reserve, more and more people realized the importanceand urgent of sewage disposal. Active bacteria method is an effectivetechnique for sewage disposal,The lacunaris plastic is an effectivebasement for active bacteria adhesion for sewage disposal. The abundancerequirement for lacunaris plastic makes it is a consequent for the plastic producing with automation and high productivity. Therefore, it is very necessary to design a manipulator that can automatically fulfill the plastic holding. With the analysis of the problems in the design of the plastic holding manipulator and synthesizing the robot research and development condition in recent years, a economic scheme is concluded on the basis of the analysis of mechanical configuration, transform system, drive device and control system and guided by the idea of the characteristic and complex of mechanical configuration, electronic, software and hardware. In this article, the mechanical configuration combines the character of direction coordinate and the arthrosis coordinate which can improve the stability and operation flexibility of the system. The main function of the transmission mechanism is to transmit power to implement department and complete the necessary movement. In this transmission structure, the screw transmission mechanism transmits the rotary motion into linear motion. Worm gear can give vary transmission ratio. Both of the transmission mechanisms have a characteristic of compact structure. The design of drive system often is limited by the environment condition and the factor of cost and technical lever. ''''The step motor can receive digital signal directly and has the ability to response outer environment immediately and has no accumulation error, which often is used in driving system. In this driving system, open-loop control system is composed of stepping motor, which can satisfy the demand not only for control precision but also for the target of economic and practicality. On this basis,the analysis of stepping motor in power calculating and style selecting is also given. The analysis of kinematics and dynamics for object holding manipulator is given in completing the design of mechanical structure and drive system. Kinematics analysis is the basis of path programming and track control. The positive and reverseanalysis of manipulator gives the relationship between manipulator space and drive space in position and speed. The relationship between manipulator’s tip position and arthrosis angles is concluded by coordinate transform method. The geometry method is used in solving inverse kinematics problem and the result will provide theory evidence for control system. The f0unction of dynamics is to get the relationship between the movement and force and the target is to satisfy the demand of real time control. in this chamfer, Newton-Euripides method is used in analysis dynamic problem of the cleaning robot and the arthrosis force and torque are given which provide the foundation for step motor selecting and structure dynamic optimal ting. Control system is the key and core part of the object holding manipulator system design which will direct effect the reliability and practicality of the robot system in the division of configuration and control function and also will effect or limit the development cost and cycle. With the demand of the PCL-839 card, the PC computer which has a. tight structure and is easy to be extended is used as the principal computer cell and takes the function of system initialization, data operation and dispose, step motor drive and error diagnose and so on. A t the same time, the configuration structure features, task principles and the position function with high precision of the control card PCL-839 are analyzed. Hardware is the matter foundation of the control. System and the software is the spirit of the control system. The target of the software is to combine all the parts in optimizing style and to improve the efficiency and reliability of the control system. The software design of the object holding manipulator control system is divided into several blocks such as system initialization block, data process block and error station detect and dispose model and so on. PCL-839 card can solve the communication between the main computer and the control cells and take the measure of reducing the influence of the outer signalto the control system. The start and stop frequency of the step motor is far lower than the maximum running frequency. In order to improve the efficiency of the step motor, the increase and decrease of the speed is must considered when the step motor running in high speed and start or stop with great acc eleration. The increase and decrease of the motor’s speed can be controlled by the pulse freque ncy sent to the step motor drive with a rational method. This can be implemented either by hardware or by software. A step motor shift control method is proposed, which is simple to calculate, easy to realize and the theory means is straightforward. The motor'''' s acceleration can fit the torque-frequency curve properly with this method. And the amount of calculation load is less than the linear acceleration shift control method and the method which is based on the exponential rule to change speed. The method is tested by experiment. A t last, the research content and the achievement are sum up and the problems and shortages in main the content are also listed. The development and application of robot in the future is expected.The purpose of manipulator control is to maintain the dynamic response of a computer-based manipulator in accordance with some prespecified system performance and desired goals. In general, the dynamic performance of a manipulator directly depends on the efficiency of the control algorithms and the dynamic model of the manipulator. The control problem consists of obtaining dynamic models of the physical robot arm system and then specifying corresponding control laws or strategies to achieve the desired system response and performance.Current industrial approaches to robot arm control treat each joint of the robot arm as a simple joint servomechanism. The servomechanism approach models the varying dynamics of a manipulator inadequately because it neglects the motion and configuration of the whole arm mechanism. These changes in the parameters of the controlled system sometimes are significant enough to render conventional feedback control strategies ineffective. The result is reduced servo response speed anddamping, limiting the precision and speed of the end-effector and making it appropriate only for limited-precision tasks. Manipulators controlled in this manner move at slow speeds with unnecessary vibrations. Any significant performance gain in this and other areas of robot arm control require the consideration of more efficient dynamic models, sophisticated control approaches, and the use of dedicated computer architectures and parallel processing techniques.In the industrial production and other fields, people often endangered by such factors as high temperature, corrode, poisonous gas and so forth at work, which have increased labor intensity and even jeopardized the life sometimes. The corresponding problems are solved since the robot arm comes out. The robot arms can catch, put and carry objects, and its movements are flexible and diversified. It applies to medium and small-scale automated production in which production varieties can be switched. And it is widely used on soft automatic line. The robot arms are generally made by withstand high temperatures, resist corrosion of materials to adapt to the harsh environment. So they reduced the labor intensity of the workers significantly and raised work efficiency. The robot arm is an important component of industrial robots, and it can be called industrial robots on many occasions. Industrial robot is set machinery, electronics, control, computers, sensors, artificial intelligence and other advanced technologies in the integration of multidisciplinary important modern manufacturing equipment. Widely using industrial robots, not only can improve product quality and production, but also is of great significance for physical security protection, improvement of the environment for labor, reducing labor intensity, improvement of labor productivity, raw material consumption savings and lowering production costs.There are such mechanical components as ball footbridge, slides, an air control mechanical hand and so on in the design. A programmable controller, a programming device, stepping motors, stepping motors drives, direct current motors, sensor, switch power supply, an electromagnetism valve and control desk are used in electrical connection. The programmable controller output two lines pulses to stepping motors drives to drive the two stepping motors drives on beam and verticalaxis; direct current motors drive the rotation of the base and the hand; sensors send signals of location to the mainframe, and the mainframe sends directive to control the extension and contraction, up and down, moves location; the mainframe send signals to control the opening and closing of the hand to carry objects. Related parameters can be changed according to request of the changes of the objects and movement flow at any time change the relevant parameters in the design, so it has great flexibility and operability.。

机器人外文翻译(文献翻译_中英文翻译)

机器人外文翻译(文献翻译_中英文翻译)

外文翻译外文资料:RobotsFirst, I explain the background robots, robot technology development. It should be said it is a common scientific and technological development of a comprehensive results, for the socio-economic development of a significant impact on a science and technology. It attributed the development of all countries in the Second World War to strengthen the economic input on strengthening the country's economic development. But they also demand the development of the productive forces the inevitable result of human development itself is the inevitable result then with the development of humanity, people constantly discuss the natural process, in understanding and reconstructing the natural process, people need to be able to liberate a slave. So this is the slave people to be able to replace the complex and engaged in heavy manual labor, People do not realize right up to the world's understanding and transformation of this technology as well as people in the development process of an objective need. Robots are three stages of development, in other words, we are accustomed to regarding robots are divided into three categories. is a first-generation robots, also known as teach-type robot, it is through a computer, to control over one of a mechanical degrees of freedom Through teaching and information stored procedures, working hours to read out information, and then issued a directive so the robot can repeat according to the people at that time said the results show this kind of movement again, For example, the car spot welding robots, only to put this spot welding process, after teaching, and it is always a repeat of a work It has the external environment is no perception that the force manipulation of the size of the work piece there does not exist, welding 0S It does not know, then this fact from the first generation robot, it will exist this shortcoming, it in the 20th century, the late 1970s, people started to study the second-generation robot, called Robot with the feeling that This feeling with the robot is similar in function of a certain feeling, forinstance, force and touch, slipping, visual, hearing and who is analogous to that with all kinds of feelings, say in a robot grasping objects, In fact, it can be the size of feeling out, it can through visual, to be able to feel and identify its shape, size, color Grasping an egg, it adopted a acumen, aware of its power and the size of the slide. Third-generation robots, we were a robotics ideal pursued by the most advanced stage, called intelligent robots, So long as tell it what to do, not how to tell it to do, it will be able to complete the campaign, thinking and perception of this man-machine communication function and function Well, this current development or relative is in a smart part of the concept and meaning But the real significance of the integrity of this intelligent robot did not actually exist, but as we continued the development of science and technology, the concept of intelligent increasingly rich, it grows ever wider connotations.Now, I would like to briefly outline some of the industrial robot situation. So far, the industrial robot is the most mature and widely used category of a robot, now the world's total sales of 1.1 million Taiwan, which is the 1999 statistics, however, 1.1 million in Taiwan have been using the equipment is 75 million, this volume is not small. Overall, the Japanese industrial robots in this one, is the first of the robots to become the Kingdom, the United States have developed rapidly. Newly installed in several areas of Taiwan, which already exceeds Japan, China has only just begun to enter the stage of industrialization, has developed a variety of industrial robot prototype and small batch has been used in production.Spot welding robot is the auto production line, improve production efficiency and raise the quality of welding car, reduce the labor intensity of a robot. It is characterized by two pairs of robots for spot welding of steel plate, bearing a great need for the welding tongs, general in dozens of kilograms or more, then its speed in meters per second a 5-2 meter of such high-speed movement. So it is generally five to six degrees of freedom, load 30 to 120 kilograms, the great space, probably expected that the work of a spherical space, a high velocity, the concept of freedom, that is to say, Movement is relatively independent of the number of components, the equivalent of our body, waist is a rotary degree of freedom We have to be able to hold his arm, Arm can be bent, then this three degrees of freedom, Meanwhile there is a wristposture adjustment to the use of the three autonomy, the general robot has six degrees of freedom. We will be able to space the three locations, three postures, the robot fully achieved, and of course we have less than six degrees of freedom. Have more than six degrees of freedom robot, in different occasions the need to configure.The second category of service robots, with the development of industrialization, especially in the past decade, Robot development in the areas of application are continuously expanding, and now a very important characteristic, as we all know, Robot has gradually shifted from manufacturing to non-manufacturing and service industries, we are talking about the car manufacturer belonging to the manufacturing industry, However, the services sector including cleaning, refueling, rescue, rescue, relief, etc. These belong to the non-manufacturing industries and service industries, so here is compared with the industrial robot, it is a very important difference. It is primarily a mobile platform, it can move to sports, there are some arms operate, also installed some as a force sensor and visual sensors, ultrasonic ranging sensors, etc. It’s surrounding environment for the conduct of identification, to determine its campaign to complete some work, this is service robot’s one of the basic characteristics.For example, domestic robot is mainly embodied in the example of some of the carpets and flooring it to the regular cleaning and vacuuming. The robot it is very meaningful, it has sensors, it can furniture and people can identify, It automatically according to a law put to the ground under the road all cleaned up. This is also the home of some robot performance.The medical robots, nearly five years of relatively rapid development of new application areas. If people in the course of an operation, doctors surgery, is a fatigue, and the other manually operated accuracy is limited. Some universities in Germany, which, facing the spine, lumbar disc disease, the identification, can automatically use the robot-aided positioning, operation and surgery Like the United States have been more than 1,000 cases of human eyeball robot surgery, the robot, also including remote-controlled approach, the right of such gastrointestinal surgery, we see on the television inside. a manipulator, about the thickness fingers such a manipulator, inserted through the abdominal viscera, people on the screen operating the machines hand, it also used the method of laser lesion laser treatment, this is the case, peoplewould not have a very big damage to the human body.In reality, this right as a human liberation is a very good robots, medical robots it is very complex, while it is fully automated to complete all the work, there are difficulties, and generally are people to participate. This is America, the development of such a surgery Lin Bai an example, through the screen, through a remote control operator to control another manipulator, through the realization of the right abdominal surgery A few years ago our country the exhibition, the United States has been successful in achieving the right to the heart valve surgery and bypass surgery. This robot has in the area, caused a great sensation, but also, AESOP's surgical robot, In fact, it through some equipment to some of the lesions inspections, through a manipulator can be achieved on some parts of the operation Also including remotely operated manipulator, and many doctors are able to participate in the robot under surgery Robot doctor to include doctors with pliers, tweezers or a knife to replace the nurses, while lighting automatically to the doctor's movements linked, the doctor hands off, lighting went off, This is very good, a doctor's assistant.Robot is mankind's right-hand man; friendly coexistence can be a reliable friend. In future, we will see and there will be a robot space inside, as a mutual aide and friend. Robots will create the jobs issue. We believe that there would not be a "robot appointment of workers being laid off" situation, because people with the development of society, In fact the people from the heavy physical and dangerous environment liberated, so that people have a better position to work, to create a better spiritual wealth and cultural wealth.译文资料:机器人首先我介绍一下机器人产生的背景,机器人技术的发展,它应该说是一个科学技术发展共同的一个综合性的结果,同时,为社会经济发展产生了一个重大影响的一门科学技术,它的发展归功于在第二次世界大战中各国加强了经济的投入,就加强了本国的经济的发展。

机器人外文翻译外文文献英文文献采用模糊逻辑控制使自主机器人避障设计

机器人外文翻译外文文献英文文献采用模糊逻辑控制使自主机器人避障设计

Autonomous robot obstacle avoidance using a fuzzy logic control schemeWilliam MartinSubmitted on December 4, 2009CS311 - Final Project1. INTRODUCTIONOne of the considerable hurdles to overcome, when trying to describe areal-world control scheme with first-order logic, is the strong ambiguity found in both semantics and evaluations. Although one option is to utilize probability theory in order to come up with a more realistic model, this still relies on obtaining information about an agent's environment with some amount of precision. However, fuzzy logic allows an agent to exploit inexactness in its collected data by allowing for a level of tolerance. This can be especially important when high precision or accuracy in a measurement is quite costly. For example, ultrasonic and infrared range sensors allow for fast and cost effective distance measurements with varying uncertainty. The proposed applications for fuzzy logic range from controlling robotic hands with six degrees of freedom1 to filtering noise from a digital signal.2 Due to its easy implementation, fuzzy logic control has been popular for industrial applications when advanced differential equations become either computationally expensive or offer no known solution. This project is an attempt to take advantage of these fuzzy logic simplifications in order to implement simple obstacle avoidance for a mobile robot. 2. PHYSICAL ROBOT IMPLEMENTATION2.1. Chassis and sensorsThe robotic vehicle's chassis was constructed from an Excalibur EI-MSD2003 remote control toy tank. The device was stripped of all electronics, gears, and extraneous parts in order to work with just the empty case and two DC motors for the tank treads. However, this left a somewhat uneven surface to work on, so high-density polyethylene (HDPE) rods were used to fill in empty spaces. Since HDPE has a rather low surface energy, which is not ideal for bonding with other materials, a propanetorch was used to raise surface temperature and improve bonding with an epoxy adhesive.Three Sharp GP2D12 infrared sensors, which have a range of 10 to 80 cm, were used for distance measurements. In order to mount these appropriately, a 2.5 by 15 cm piece of aluminum was bent into three even pieces at 135 degree angles. This allows for the IR sensors to take three different measurements at 45 degree angles (right, middle, and left distances). This sensor mount was then attached to an HDPE rod with mounting tape and the rod was glued to the tank base with epoxy. Since the minimum distance that can be reliably measured with these sensors is 10 cm, the sensors were placed about 9 cm from the front of the vehicle. This allowed measurements to be taken very close to the front of the robot.2.2. ElectronicsIn order to control the speed of each motor, pulse-width modulation (PWM) was used to drive two L2722 op amps in open loop mode (Fig. 1). The high input resistance of these ICs allow for the motors to be powered with very little power draw from the PWM circuitry. In order to isolate the motor's power supply from the rest of the electronics, a 9.6 V NiCad battery was used separately from a standard 9 V that demand on the op amps led to a small amount of overheating during continuous operation. This was remedied by adding small heat sinks and a fan to the forcibly disperse heat.Fig. 1. The control circuit used for driving each DC motor. Note that the PWM signal was between 0 and 5 V.2.3. MicrocontrollerComputation was handled by an Arduino Duemilanove board with anATmega328 microcontroller. The board has low power requirements and modifications. In addition, it has a large number of prototyping of the control circuit and based on the Wiring language. This board provided an easy and low-cost platform to build the robot around.3. FUZZY CONTROL SCHEME FORIn order to apply fuzzy logic to the robot to interpret measured distances. While the final algorithm depended critically on the geometry of the robot itself and how it operates, some basic guidelines were followed. Similar research projects provided both simulation results and ideas for implementing fuzzy control.3,4,53.1. Membership functionsThree sets of membership functions were created to express degrees of membership for distances, translational speeds, and rotational speeds. This made for a total of two input membership functions and eight output membership functions (Fig.2). Triangle and trapezoidal functions were used exclusively since they are quick to compute and easy to modify. Keeping computation time to a minimum was essential so that many sets of data could be analyzed every second (approximately one every 40 milliseconds). The distance membership functions allowed the distances from the IR sensors to be quickly "fuzzified," while the eight speed membership functions converted fuzzy values back into crisp values.3.2.Rule baseOnce the input data was fuzzified, the eight defined fuzzy logic rules (Table I) were executed in order to assign fuzzy values for translational speed and rotation. This resulted in multiple values for the each of the fuzzy output components. It was then necessary to take the maximum of these values as the fuzzy value for each component. Finally, these fuzzy output values were "defuzzified" using themax-product technique and the result was used to update each of the motor speeds.(a)(b)(c)rotational speed. These functions were adapted from similar work done in reference 3.4. RESULTSThe fuzzy control scheme allowed for the robot to quickly respond to obstacles itcould detect in its environment. This allowed it to follow walls and bend aroundcorners decently without hitting any obstacles. However, since the IR sensors'measurements depended on the geometry of surrounding objects, there were times when the robot could not detect obstacles. For example, when the IR beam hit a surface with oblique incidence, it would reflect away from the sensor and not register as an object. In addition, the limited number of rules used may have limited the dynamics of the robot's responses. Some articles suggest as many as forty rules6 should be used, while others tend to present between ten and twenty. Since this project did not explore complex kinematics or computational simulations of the robot, it is difficult to determineexactly how many rules should be used. However, for the purposes of testing fuzzy logic as a navigational aide, the eight rules were sufficient. Despite the many problems that IR and similar ultrasonic sensors have with reliably obtaining distances, the robustness of fuzzy logic was frequently able to prevent the robot from running into obstacles.5. CONCLUSIONThere are several easy improvements that could be made to future iterations of this project in order to improve the robot's performance. The most dramatic would be to implement the IR or ultrasonic sensors on a servo so that they could each scan a full 180 degrees. However, this type of overhaul may undermine some of fuzzy logic's helpful simplicity. Another helpful tactic would be to use a few types of sensors so that data could be taken at multiple ranges. The IR sensors used in this experiment had a minimum distance of 10 cm, so anything in front of this could not be reliably detected. Similarly, the sensors had a maximum distance of 80 cm so it was difficult to react to objects far away. Ultrasonic sensors do offer significantly increased ranges at a slightly increased cost and response time. Lastly, defining more membership functions could help improve the rule base by creating more fine tuned responses. However, this would again increase the complexity of the system.Thus, this project has successfully implemented a simple fuzzy control scheme for adjusting the heading and speed of a mobile robot. While it is difficult to determine whether this is a worthwhile application without heavily researching other methods, it is quite apparent that fuzzy logic affords a certain level of simplicity in thedesign of a system. Furthermore, it is a novel approach to dealing with high levels of uncertainty in real-world environments.6. REFERENCES1 Ed. M. Jamshidi, N. Vadiee, and T. Ross, Fuzzy logic and control: software and hardware applications, (Prentice Hall: Englewood Cliffs, NJ) 292-328.2 Ibid, 232-261.3 W. L. Xu, S. K. Tso, and Y. H. Fung, "Fuzzy reactive control of a mobile robot incorporating a real/virtual target switching strategy," Robotics and Autonomous Systems, 23(3), 171-186 (1998).4 V. Peri and D. Simon, “Fuzzy logic control for an autonomous robot,” 2005 Annual Meeting of the North American Fuzzy Information Processing Society, 337-342 (2005).5 A. Martinez, E. Tunstel, and M. Jamshidi, "Fuzzy-logic based collision-avoidance for a mobile robot," Robotica, 12(6) 521–527 (1994).6 W. L. Xu, S. K. Tso, and Y. H. Fung, "Fuzzy reactive control of a mobile robot incorporating a real/virtual target switching strategy," Robotics and Autonomous Systems, 23(3), 171-186 (1998).采用模糊逻辑控制使自主机器人避障设计威廉马丁提交于2009年12月4日CS311 -最终项目1 引言其中一个很大的障碍需要克服,当试图用控制逻辑一阶来描述一个真实世界设计在发现在这两个语义评价中是个强大的模糊区。

  1. 1、下载文档前请自行甄别文档内容的完整性,平台不提供额外的编辑、内容补充、找答案等附加服务。
  2. 2、"仅部分预览"的文档,不可在线预览部分如存在完整性等问题,可反馈申请退款(可完整预览的文档不适用该条件!)。
  3. 3、如文档侵犯您的权益,请联系客服反馈,我们会尽快为您处理(人工客服工作时间:9:00-18:30)。

(文档含英文原文和中文翻译)中英文资料外文翻译文献机器人和机器人传感器介绍工业机器人以及它的运行是本文的主题。

工业机器人是应用于制造环境下以提高生产率的一种工具。

它可用于承担常规的、冗长乏味的装配线工作,或执行那些对工人也许有危害的工作。

例如,在第一代工业机器人中,曾有一台被用于更换核电厂的核燃料棒。

从事这项工作的工人可能会暴露在有害量的放射线下。

工业机器人也能够在装配线上操作——安装小型元件,例如将电子元件安装在线路板上。

为此,工人可以从这种冗长乏味任务的常规操作中解放出来。

通过编程的机器人还能去掉炸弹的雷管、为残疾者服务以及在我们社会的众多应用中发挥作用。

机器人可被看作将臂端执行工具、传感器以及/或夹爪移动到某个预定位置的一台机器。

当机器人到达该位置,它将执行某个任务。

该任务可能是焊接、密封、机械装载、机械卸载,或许多装配工作。

除了编程以及打开和关闭系统之外,一般情况下,均不需要人们的参与就能完成这类工作。

机器人专业术语机器人是一台可再编程的多功能机械手,它可通过可编程运动移动零件、物料、工具或特殊装置以执行某种不同任务。

由这项定义可导致下面段落中被阐述的其他定义,它们为机器人系统提供了完整的写照。

预编程位置是机器人为了完成工作必须遵循和通过的途径。

在这些位置的某点,机器人会停下来并执行某种操作,例如装配零件,喷漆或焊接。

这些预编程位置被存储在机器人的记忆装置中供以后继续操作时使用。

此外,当工作的要求发生变化时,不仅其他编程数据而且这些预编程位置均可作修改。

因此,正由于这种编程的特点,一台工业机器人与一台可存储数据、以及可回忆及编辑的计算机十分相似。

机械手是机器人的手臂,它允许机器人俯仰、伸缩和转动。

这种动作是由机械手的轴所提供的,机械手的轴又称为机器人的自由度。

一台机器人可以具有3至16根轴。

在本人的后面部分,自由度这个术语总与一台机器人轴的数目相关联。

工具及夹爪并非属于机器人系统的本身,它们是装在机器人手臂端部的附件。

有了与机器人手臂端部相连接的这些附件,机器人就可以提起零件、点焊、喷漆、弧焊、钻孔、去毛刺,还可以根据所提要求指向各种类型的任务。

机器人系统还可以控制操作机器人的工作单元。

机器人工作单元是一种总体环境,在该环境下机器人必须执行赋予它的任务。

该单元可包容控制器、机器人的机械手、工作台、安全装置,或输送机。

机器人开展工作所需要的所有设备均被包括在这个工作单元中。

此外,来自外界装置的信号能够与机器人进行交流,这样就可以告诉机器人什么时候它该装配零件、捡起零件或将零件卸到输送机。

基本部件机器人系统具有3个基本部件:机械手、控制器及动力源。

在某些机器人系统中可以看到第4个部件,端部执行件,有关这些部件将在下面小节描述。

机械手机械手承担机器人系统的体力工作,它由两部分组成:机械部分及被连接的附属物。

机械手还有一个与附属物相连的底座。

机械手的底座通常被固定在工作领域的地面。

有时,底座也可以移动。

在该情况下,底座被安装到导轨上,这样该机械手就可以从一处移动到另一处。

例如,一台机器人可以为几台机床工作,为每台机床装载和卸载。

正如前面所述,附属物从机器人的底座伸出。

该附属物是机器人的手臂。

它既可以是一个直线型的可动臂,也可以是一个铰接臂。

铰接臂也称关节臂。

机器人机械手的附属物可为机械手提供各种运动轴。

这些轴与固定底座相连接,而该底座又被紧固到机架上。

这个机架能确保该机械手被维持在某个位置上。

在手臂的端部连接着一个手腕。

该手腕由附加轴及手腕法兰组成,有了该手腕法兰,机器人用户就可以根据不同的工作在手腕上安装不同的工具。

机械手的轴允许机械手在一定区域内执行工作。

如前所述,该区域被称为机器人的工作单元,它的尺度与机械手的尺寸相对应。

当机器人的物理尺寸增大时,工作单元的尺寸必然也随之增加。

机械手的运动由驱动器,或驱动系统所控制。

驱动器或驱动系统允许各根轴在工作单元内运动,驱动系统可利用电力的、液压的或气压动力。

驱动系统发出的能量由各种机械驱动装置转换成机械动力。

这些驱动装置通过机械联动机构接合在一起。

这些联动机构依次驱动机器人的不同轴。

机械联动机构由链轮机构,齿轮机构及滚珠丝杠所组成。

控制器机器人系统的控制器是运行的心脏。

控制器存储着为以后回忆所用的预编程信息,控制着外围设备,它还与厂内计算机进行交流以使生产不断更新。

控制器用于控制机器人机械手运动以及工作单元中的外围部件。

工作人员可以利用手递示教盒将机械手的动作编程进入控制器。

这种信息可被存储在控制器的记忆装置中以便以后回忆使用。

控制器存储着机器人系统的所有程序数据。

它可以存储几种不同的程序,并且它们中任一程序均可被编辑。

也可要求控制器与工作单元中外围设备进行交流。

例如,控制器具有一根输入线,该输入线可识别某项机械加工什么时候完成。

当该机械循环完成时,输入线被接通,它会吩咐控制器让机械手到位以便机械手能夹起以加工完的零件。

接着,该机械手再捡起一根新的零件并将它安放到机床上,然后,控制器向该机床发出信号让它开始运转。

控制器可由机械操纵的磁鼓构成,这些鼓按工作发生的先后次序操作。

这类控制器用于非常简单的机器人系统。

在大多数机器人系统中见到的控制器是很复杂的装置,它们体现了现代化的电子科学。

换言之,它们由微信息处理器操纵。

这些微信息处理器不是8位、16位就是32位的信息处理器。

这种功能使控制器的运行具有非常好的柔性。

控制器可通过通讯线路发出电子信号,发出能与机械手各轴线进行沟通的电信号,机器人机械手与控制器之间这种双向交流可使系统的位置及运行维持在不断修正及更新得状态下,控制器还可以控制安装在机器人手腕端部的任意工具。

控制器还有与工厂中不同计算机开展交流的任务,这个通讯网络可使机器人成为计算机辅助制造(CAM)系统的一部分。

根据上述基本定义,机器人是一台可再编程序的多功能机械手。

所以,控制器必须包含某种形式的记忆存储器,以微信息处理器为基础的系统常与固态记忆装置连同运行。

这些记忆装置可以是磁泡、随机存取记忆装置、软塑料磁盘或磁带。

每种记忆存储装置均可存储编程信息以便以后回忆使用。

动力源动力源是向控制器及机械手供给动力得装置,有两类动力供给机器人系统。

一类动力是供控制器运行的交流点动力,另一类被用于驱动机械手各轴。

例如,若机器人的机械手由液压或气压装置控制,则控制信号被发送到这些装置才能使机器人运动。

每个机器人系统均需要动力来驱动机械手,这种动力既可由液压动力源、气压动力源,也可以由电力动力源提供,这些动力源是机器人工作单元总的部件及设备中的一部分。

当液压动力源与及机器人机械手底座相连接,液压源产生液压流体,这些流体输送到机械手各控制元件,于是,使轴绕机器人底座旋转。

压力空气被输送到机械手,使轴沿轨道作直线运动,也可将这种气动源连接到钻床,它可为钻头的旋转提供动力。

一般情况下,可从工厂得供给站获取气动源并做调整,然后将它输入机器人机械手的轴。

电动机可以是交流式的,也可以是直流式的。

控制器发出的脉冲信号被发送到机械手得电机。

这些脉冲为电机提供必要的指令信息以使机械手在机器人底座上旋转。

用于机械手轴的三种动力系统任一种均需要使用反馈监督系统,这种系统会不断地将每个轴位置数据反馈给控制器。

每种机器人系统不仅需要动力来开动机械手的轴,还需要动力来驱动控制器,这种动力可由制造环境的动力源提供。

端部执行件在大部分机器人应用的场合见到的端部执行件均是机械手手腕法兰相连接的一个装置,端部执行件可应用于生产领域中许多不同场合,例如,它可用于捡起零件,用于焊接,或用于喷漆,端部执行件为机器人系统提供了机器人运行时必须的柔性。

通常所设计得端部执行件可满足机器人用户的需要。

这些部件可由机器人制造商或机器人系统的物主制造。

端部执行件事机器人系统中唯一可将一种工作变成另一种工作的部件,例如,即日起可与喷水割机相连,它在汽车生产线上被用于切割板边。

也可要求机器人将零件安放到磁盘中,在这简单的过程中,改变了机器人端部执行件,该机器人就可以用于其它应用场合,端部执行件得变更以及机器人的再编程序可使该系统具有很高的柔性。

机器人传感器尽管机器人有巨大的能力,但很多时候却比不过没有经过一点训练的工人。

例如,工人们能够发现零件掉在地上或发现进料机上没有零件,但没有了传感器,机器人就得不到这些信息,及时使用最尖端的传感器,机器人也比不上一个经验丰富的工人,因此,一个好的机器人系统的设计需要使用许多传感器与机器人控制器相接,使其尽可能接近操作工人得感知能力。

机器人技术最经常使用的传感器分为接触式的与非接触式的。

接触式传感器可以进一步分为触觉传感器、力和扭矩传感器。

触觉或接触传感器可以测出受动器端与其他物体间的实际接触,微型开关就是一个简单的触觉传感器,当机器人得受动气端与其他物体接触时,传感器是机器人停止工作,避免物体间的碰撞,告诉机器人已到达目标;或者在检测时用来测量物体尺寸。

力和扭矩传感器位于机器人得抓手与手腕的最后一个关节之间,或者放在机械手得承载部件上,测量反力与力矩。

力和扭矩传感器有压电传感器和装在柔性部件上的应变仪等。

非接触传感器包括接近传感器、视觉传感器、声敏元件及范围探测器等。

接近传感器和标示传感器附近的物体。

例如,可以用涡流传感器精确地保持与钢板之间的固定的距离。

最简单的机器人接近传感器包括一个发光二极管发射机和一个光敏二极管接收器,接收反射面移近时的反射光线,这种传感器的主要缺点是移近物对光线的反射率会影响接收信号。

其他得接近传感器使用的是与电容和电感相关的原理。

视觉传感系统十分复杂,基于电视摄像或激光扫描的工作原理。

摄像信号经过硬件预处理,以30帧至60帧每秒的速度输入计算机。

计算机分析数据并提取所需的信息,例如,物体是否存在以及物体的特征、位置、操作方向,或者检测元件的组装及产品是否完成。

声敏元件用来感应并解释声波,从基本的声波探测到人们连续讲话的逐字识别,各种声敏元件的复杂程序不等,除了人机语音交流外,机器人还可以使用声敏元件控制弧焊,听到碰撞或倒塌的声音时阻止机器人的运动,预测将要发生的机械破损及检测物体内部缺陷。

还有一种非接触系统使用投影仪和成像设备获取物体的表面形状信息或距离信息。

传感器有静态探测与闭环探测两种使用方法。

当机器人系统的探测和操作动作交替进行时,通常就要使用传感器,也就是说探测时机器人不操作,操作时与传感器无关,这种方法被称为静态探测,使用这种方法,视觉传感器先寻找被捕捉物体的位置与方向,然后机器人径直朝那个地点移动。

相反,闭式探测的机器人在操作运动中,始终受传感器的控制,多数视觉传感器都采用闭环模式,它们随时监测机器人的实际位置与理想位置间的偏差,并驱动机器人修正这一偏差。

在闭环探测中,即使物体在运动,例如在传送带上,机器人也能抓住它并把它送到预定位置。

相关文档
最新文档