外文翻译-简单结构管道检测机器人

双轮链平面管道检测机器人外文文献翻译、中英文翻译、外文翻译

双轮链平面管道检测机器人

摘要:介绍了一种新型的多传感器管道检测机器人,用于80-100mm管道的检测。

该机器人的特点是只需使用两个轮链即可实现驱动和转向功能。

与普遍采用的三轮链条管道机器人相比,新设计允许简单的机器人控制和方便的用户界面,特别是在T形分支。

作为另一个优点,这种机器人的平面形状允许在机器人的两侧安装额外的传感器。

介绍了系统的运动学和三种控制方式。

最后,通过实验验证了该机器人系统的性能。

关键词:管道机器人;系统;运动学

I. 绪论

管道检测机器人的功能可以描述为驱动、转向、检测和检索。

而用于直径小于100mm管道检测的机器人,在设计紧凑的转向机构和安装磁探头、超声波探头等传感器检测裂纹、破裂、泄漏等方面存在特殊困难。

管道机器人机构在机器人技术领域有着悠久的发展历史,按其运动模式可分为几种基本形式。

它们有轮式、尺蠖式、腿式、螺旋式、履带式和被动式。

其中轮式管道检测机器人最为流行,[1]-[8]。

然而,它们不适合在垂直路径或T形分支中运行。

近10年来,人们对差动驱动型机构[9]-[11]进行了较为深入的研究。

差动驱动类型通常有三个动力轮链。

通过独立控制每条轮链的速度,机器人可以通过弯头和T形分支。

然而,当只使用一个机器人模块时,有时会在T形分支[9]处发生奇异运动。

为了解决这一问题,已经开发了几种方法,如主动转向关节机构[12-13]或两个机器人模块[9]协作。

然而,整个机器人系统的体积变得庞大。

使用三个动力轮链的另一个缺点是没有足够的空间在机器人体内安装更多的传感器,因为三个轮链占据了管道的大部分横截面积,特别是直径小于100mm的管道。

目前,只能在机器人本体前端安装一个摄像头。

T. Okad等[14-16]开发了平板式管道检测机器人。

然而,他们的设计是复杂的,并用于大型管道。

针对这些因素,我们提出了一种双动力轮链的管道检测机构。

两个轮链以180度的角度分开布置,所以可以在机器人身体的两侧附加传感器。

每条轮链由两台电机控制:一台用于驱动,另一台用于转向。

机器人结构论文中英文对照资料外文翻译文献

中英文对照资料外文翻译文献FEM Optimization for Robot StructureAbstractIn optimal design for robot structures, design models need to he modified and computed repeatedly. Because modifying usually can not automatically be run, it consumes a lot of time. This paper gives a method that uses APDL language of ANSYS 5.5 software to generate an optimal control program, which mike optimal procedure run automatically and optimal efficiency be improved.1)IntroductionIndustrial robot is a kind of machine, which is controlled by computers. Because efficiency and maneuverability are higher than traditional machines, industrial robot is used extensively in industry. For the sake of efficiency and maneuverability, reducing mass and increasing stiffness is more important than traditional machines, in structure design of industrial robot.A lot of methods are used in optimization design of structure. Finite element method is a much effective method. In general, modeling and modifying are manual, which is feasible when model is simple. When model is complicated, optimization time is longer. In the longer optimization time, calculation time is usually very little, a majority of time is used for modeling and modifying. It is key of improving efficiency of structure optimization how to reduce modeling and modifying time.APDL language is an interactive development tool, which is based on ANSYS and is offered to program users. APDL language has typical function of some large computer languages. For example, parameter definition similar to constant and variable definition, branch and loop control, and macro call similar to function and subroutine call, etc. Besides these, it possesses powerful capability of mathematical calculation. The capability of mathematical calculation includes arithmetic calculation, comparison, rounding, and trigonometric function, exponential function and hyperbola function of standard FORTRAN language, etc. By means of APDL language, the data can be read and then calculated, which is in database of ANSYS program, and running process of ANSYS program can be controlled.Fig. 1 shows the main framework of a parallel robot with three bars. When the length of three bars are changed, conjunct end of three bars can follow a given track, where robot hand is installed. Core of top beam is triangle, owing to three bars used in the design, which is showed in Fig.2. Use of three bars makes top beam nonsymmetrical along the plane that is defined by two columns. According to a qualitative analysis from Fig.1, Stiffness values along z-axis are different at three joint locations on the top beam and stiffness at the location between bar 1 and top beam is lowest, which is confirmed by computing results of finite element, too. According to design goal, stiffness difference at three joint locations must he within a given tolerance. In consistent of stiffness will have influence on the motion accuracy of the manipulator under high load, so it is necessary to find the accurate location of top beam along x-axis.To the questions presented above, the general solution is to change the location of the top beam many times, compare the results and eventually find a proper position, The model will be modified according to the last calculating result each time. It is difficult to avoid mistakes if the iterative process is controlled manually and the iterative time is too long. The outer wall and inner rib shapes of the top beam will be changed after the model is modified. 
To find the appropriate location of top beam, the model needs to be modified repetitiously.Fig. 1 Solution of Original DesignThis paper gives an optimization solution to the position optimization question of the top beam by APDL language of ANSYS program. After the analysis model first founded, the optimization control program can be formed by means of modeling instruction in the log file. The later iterative optimization process can be finished by the optimization control program and do not need manual control. The time spent in modifying the model can be decreased to the ignorable extent. The efficiency of the optimization process is greatly improved.2)Construction of model for analysisThe structure shown in Fig. 1 consists of three parts: two columns, one beam and three driving bars. The columns and beam are joined by the bolts on the first horizontal rib located on top of the columns as shown in Fig.1. Because the driving bars are substituted by equivalentforces on the joint positions, their structure is ignored in the model.The core of the top beam is three joints and a hole with special purpose, which can not be changed. The other parts of the beam may be changed if needed. For the convenience of modeling, the core of the beam is formed into one component. In the process of optimization, only the core position of beam along x axis is changed, that is to say, shape of beam core is not changed. It should be noticed that, in the rest of beam, only shape is changed but the topology is not changed and which can automatically be performed by the control program.Fig.1, six bolts join the beam and two columns. The joint surface can not bear the pull stress in the non-bolt joint positions, in which it is better to set contact elements. When the model includes contact elements, nonlinear iterative calculation will be needed in the process of solution and the computing time will quickly increase. The trial computing result not including contact element shows that the outside of beam bears pulling stress and the inner of beam bears the press stress. Considering the primary analysis object is the joint position stiffness between the top beam and the three driving bars, contact elements may not used, hut constructs the geometry model of joint surface as Fig.2 showing. The upper surface and the undersurface share one key point in bolt-joint positions and the upper surface and the under surface separately possess own key points in no bolt positions. When meshed, one node will be created at shared key point, where columns and beam are joined, and two nodes will be created at non shared key point, where column and beam are separated. On right surface of left column and left surface of right column, according to trial computing result, the structure bears press stress. Therefore, the columns and beam will share all key points, not but at bolts. This can not only omit contact element but also show the characteristic of bolt joining. The joining between the bottoms of the columns and the base are treated as full constraint. Because the main aim of analysis is the stiffness of the top beam, it can be assumed that the joint positions hear the same as load between beam and the three driving bars. The structure is the thin wall cast and simulated by shell element . The thickness of the outside wall of the structure and the rib are not equal, so two groups of real constant should he set. For the convenience of modeling, the two columns are alsoset into another component. The components can create an assembly. 
In this way, the joint positions between the beam core and columns could he easily selected, in the modifying the model and modifying process can automatically be performed. Analysis model is showed Fig.1. Because model and load are symmetric, computing model is only half. So the total of elements is decreased to 8927 and the total of nodes is decreased to 4341. All elements are triangle.3.)Optimization solutionThe optimization process is essentially a computing and modifying process. The original design is used as initial condition of the iterative process. The ending condition of the process is that stiffness differences of the joint locations between three driving bars and top beam are less than given tolerance or iterative times exceed expected value. Considering the speciality of the question, it is foreseen that the location is existent where stiffness values are equal. If iterative is not convergent, the cause cannot be otherwise than inappropriate displacement increment or deficient iterative times. In order to make the iterative process convergent quickly and efficiently, this paper uses the bisection searching method changing step length to modify the top beam displacement. This method is a little complex but the requirement on the initial condition is relatively mild.The flow chart of optimization as follows:1. Read the beam model data in initial position from backup file;2. Modify the position of beam;3. Solve;4. Read the deform of nodes where beam and three bars are joined;5. Check whether the convergent conditions are satisfied, if not, then continue to modify the beam displacement and return to 3, otherwise, exit the iteration procedure.6. Save the results and then exit.The program's primary control codes and their function commentaries are given in it, of which the detailed modeling instructions are omitted. 
For the convenience of comparing with the control flow, the necessary notes are added.

/BATCH                       ! the flag of the batch file in ANSYS
RESUME,robbak.db,0           ! read original data from the backup file robbak.db
/PREP7                       ! enter the preprocessor
                             ! delete the joint part between beam core and columns
                             ! move the core of the beam by one step length
                             ! apply load and constraint on the geometry, mesh the joint position between beam core and columns
FINISH                       ! exit the preprocessor
/SOLU                        ! enter the solver
SOLVE                        ! solve
FINISH                       ! exit the solver
/POST1                       ! enter the postprocessor
*GET,front,NODE,2013,U,Z     ! read the deformation of the first joint node on the beam into parameter front
*GET,back,NODE,1441,U,Z      ! read the deformation of the second joint node on the beam into parameter back
lastdif=1                    ! the absolute value of the initial difference between front and back last time
flag=-1                      ! the feasibility flag of the optimization
step=0.05                    ! the initial displacement from the initial position to the current position
*DO,I,1,10,1                 ! the iteration procedure begins; the cycle variable is I, its range is 1-10 and its step is 1
dif=abs(front-back)          ! the absolute value of the difference between front and back in the current result
*IF,dif,LE,1.0E-6,THEN       ! check whether the absolute difference dif satisfies the requirement or not
flag=1                       ! yes, set flag equal to 1
*EXIT                        ! exit the iterative calculation
*ELSEIF,dif,GE,lastdif,THEN  ! check whether the dif value has become greater or not
flag=2                       ! yes, set flag to 2
                             ! modify the step length by the bisection method
                             ! perform the next iterative calculation, using the last position as the current position and the modified last step length as the current step length
*ELSE                        ! if the absolute difference is not less than the expected value but becomes smaller gradually, continue to move the top beam
                             ! read the initial condition from the backup file, enter the preprocessor
MEN, ,P51X, , ,step, , , ,1  ! move the core of the beam by one step length; modify the joint positions between beam core and columns; apply load and constraint; mesh
FINISH                       ! exit the preprocessor
/SOLU                        ! enter the solver
SOLVE                        ! solve
FINISH                       ! exit the solver
/POST1                       ! enter the postprocessor
*GET,front,NODE,2013,U,Z     ! read the deformation of the first joint node into parameter front
*GET,back,NODE,1441,U,Z      ! read the deformation of the second joint node into parameter back
lastdif=dif                  ! update the value of lastdif
*ENDIF                       ! the end of the if-else
*ENDDO                       ! the end of the DO cycle

Most of the control program above is copied from the log file, which is long: the total is up to about 1000 lines. Many codes, such as the modeling and post-processing codes, are used repeatedly. To make the program structure clear, these instructions can be made into macros, which are called by the main program. This can efficiently reduce the length of the main program. In addition, the modeling instructions from the log file include many special instructions that are only used under graphic mode and are useless under batch mode. Deleting and modifying these instructions under batch mode in ANSYS can reduce the length of the file, too.

In the program above, the deformation at a given position is read from the node deformation. In meshing, in order to avoid generating bad elements, a triangle mesh is used. In optimization, the shape of the joint position between the columns and the beam is continually changed. This makes the total number of elements different after each meshing, and therefore the element numbering is different, too. Data read from the database according to node numbering might not be the data wanted. Therefore, the beam core first needs to be meshed, then saved.
When read next time, its numbering is the same as last time.Evaluating whether the final result is a feasible result or not needs to check the flag value. If only the flag value is I, the result is feasible, otherwise the most proper position is not found. The total displacement of top beam is saved in parameter step. If the result is feasible, the step value is the distance from initial position to the most proper position. The sum of iterative is saved in parameter 1. According to the final value of I, feasibility of analysis result and correctness of initial condition can he evaluated.4)Optimization resultsThe sum of iterative in optimization is seven, and it takes about 2 hour and 37 minutes to find optimal position. Fig.3 shows the deformation contour of the half-construct. In Fig.3, the deformations in three joints between beam and the three driving bars is the same as level, and the corresponding deformation range is between -0.133E-04 and -0.1 15E-O4m, the requirement of the same stiffness is reached. At this time, the position of beam core along x-axis as shown in Fig. 1 has moved -0.71E-01m compared with the original designed positionBecause the speed of computer reading instruction is much faster than modifying model manually, the time modifying model can be ignored. The time necessary foroptimization mostly depends on the time of solution. Compared with the optimization procedure manually modifying model, the efficiency is improved and mistake operating in modeling is avoided.5)ConclusionThe analyzing result reveals that the optimization method given in this paper is effective and reaches the expected goal. The first advantage of this method is that manual mistakes do not easily occur in optimization procedure. Secondly, it is pretty universal and the control codes given in this paper may he transplanted to use in similar structure optimization design without large modification. The disadvantage is that the topology structure of the optimization object can not be changed. The more the workload of modifying the model, the more the advantages of this method are shown. In addition, the topology optimization function provided in ANSYS is usedto solve the optimization problem that needs to change the topology structure.The better optimization results can he achieved if the method in this paper combined with it.中文译文:机器人机构优化设计有限元分析摘要机器人结构最优化设计,设计模型需要反复的修正和计算。
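Independently of ANSYS, the step-halving search implemented by the APDL program above can be sketched as follows (a rough illustration only, not part of the original paper; find_beam_position and solve_deflections are hypothetical names, the latter standing in for the finite-element solve and the two *GET reads):

```python
def find_beam_position(solve_deflections, step=0.05, tol=1.0e-6, max_iter=10):
    """Step-halving search for the beam-core offset at which the deflections of
    the two joint nodes become equal within `tol` (cf. the APDL loop above)."""
    x = step                    # move by one step before the first evaluation
    last_dif = float("inf")
    flag = -1                   # -1: not finished, 1: converged, 2: overshoot seen

    for i in range(1, max_iter + 1):
        front, back = solve_deflections(x)   # stand-in for SOLVE + *GET reads
        dif = abs(front - back)
        if dif <= tol:
            flag = 1            # stiffness difference within tolerance: done
            break
        if dif >= last_dif:
            flag = 2
            step = -step / 2.0  # overshoot: reverse direction and halve the step
        else:
            last_dif = dif      # still improving: keep the current step
        x += step               # move the beam core for the next solve

    return flag, x, i
```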

简单结构管道检测机器人(有出处)687--中英文翻译

中文翻译:简单结构管道检测机器人

本文介绍了一种管道检测机器人的独创架构。

该机器人由通过万向节铰接的两部分组成。

一部分装有轴向布置的车轮,沿管道轴线平行移动;另一部分的车轮相对管轴倾斜,被迫绕管轴做螺旋运动。

单台电动机放置在两个机构之间,用以产生运动。

所有车轮都安装在悬架上,以适应管道中不断变化的弯曲。

该机器人有其自己的电池和无线链路。

四种机器人适用的管径分别为170、70和40毫米。

对于较小的直径,电池和无线电接收器可放置在额外的机构中。

这种架构非常简单,其旋转运动可以被利用来进行擦洗或检验任务。

关键词:自主移动机器人;管道检测;螺旋运动

管道检测机器人已经被研究了很长一段时间,人们提出了许多新颖的运动概念,以解决与管道直径变化、弯道和能源供应有关的诸多技术困难。

虽然限于篇幅无法进行详尽的文献回顾,但可以确定几大类别:1) 对于小直径管道,许多项目遵循蚯蚓原理:中央部分产生轴向运动,两端连接的装置则用于撑紧管道。

这一概念已有气动版本(如[1]),但它们需要脐带缆供电。

对于更小的直径(10毫米或更小),则采用压电驱动:或基于尺蠖原理,或利用锯齿波电压驱动的惯性运动[2],或使用具有差动摩擦系数的振动鳍[3]。

2) 对于各种中型管道,已提出多种采用车轮和履带的经典机电系统,其结构在直径适应性和转弯能力方面各不相同。

3) 对于大型管道,也已提出管内步行式爬行机器人[6]。

文中提出的四种移动机器人属于第二类,适用管径从40到170mm。该设计尝试仅用单一驱动器实现沿管道的移动,以降低机器的复杂性。尽管我们的研究是独立进行的,

但这种螺旋运动的思路此前似乎已被研究过。

体系结构(图1)

该机器人主要分为定子和转子两部分,两者之间装有一台带减速机的直流电机,并通过万向节连接。

定子装有一组车轮,只能沿管道轴线平行移动;而转子的车轮相对管轴倾斜,只能沿螺旋轨迹滚动。该机器人的轴向速度与旋转速度之间满足关系 v = ω·R·tanα,其中R为管道半径,α为车轮倾斜角度。
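作为上述运动学关系的一个简单数值示例(仅为示意,并非原文内容;其中的函数名和数值均为假设),可以按该公式由转速计算轴向速度:

```python
import math

def axial_speed(omega_rad_s: float, pipe_radius_m: float, tilt_angle_rad: float) -> float:
    """Helical-drive relation v = omega * R * tan(alpha): omega is the rotor's
    angular speed about the pipe axis, R the pipe radius, alpha the wheel tilt."""
    return omega_rad_s * pipe_radius_m * math.tan(tilt_angle_rad)

# Example: 2 rad/s rotation in an 85 mm radius (170 mm diameter) pipe, 20 degree tilt
v = axial_speed(2.0, 0.085, math.radians(20.0))
print(f"axial speed = {v * 1000:.1f} mm/s")   # about 61.9 mm/s
```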

机器人外文文献翻译、中英文翻译

外文资料robotThe industrial robot is a tool that is used in the manufacturing environment to increase productivity. It can be used to do routine and tedious assembly line jobs,or it can perform jobs that might be hazardous to the human worker . For example ,one of the first industrial robot was used to replace the nuclear fuel rods in nuclear power plants. A human doing this job might be exposed to harmful amounts of radiation. The industrial robot can also operate on the assembly line,putting together small components,such as placing electronic components on a printed circuit board. Thus,the human worker can be relieved of the routine operation of this tedious task. Robots can also be programmed to defuse bombs,to serve the handicapped,and to perform functions in numerous applications in our society.The robot can be thought of as a machine that will move an end-of-tool ,sensor ,and/or gripper to a preprogrammed location. When the robot arrives at this location,it will perform some sort of task .This task could be welding,sealing,machine loading ,machine unloading,or a host of assembly jobs. Generally,this work can be accomplished without the involvement of a human being,except for programming and for turning the system on and off.The basic terminology of robotic systems is introduced in the following:1. A robot is a reprogrammable ,multifunctional manipulator designed to move parts,material,tool,or special devices through variable programmed motions for the performance of a variety of different task. This basic definition leads to other definitions,presented in the following paragraphs,that give acomplete picture of a robotic system.2. Preprogrammed locations are paths that the robot must follow to accomplish work,At some of these locations,the robot will stop and perform some operation ,such as assembly of parts,spray painting ,or welding .These preprogrammed locations are stored in the robot’s memory and are recalled later for continuousoperation.Furthermore,these preprogrammed locations,as well as other program data,can be changed later as the work requirements change.Thus,with regard to this programming feature,an industrial robot is very much like a computer ,where data can be stoned and later recalled and edited.3. The manipulator is the arm of the robot .It allows the robot to bend,reach,and twist.This movement is provided by the manipulator’s axes,also called the degrees of freedom of the robot .A robot can have from 3 to 16 axes.The term degrees of freedom will always relate to the number of axes found on a robot.4. The tooling and frippers are not part the robotic system itself;rather,they are attachments that fit on the end of the robot’s arm. These attachments connected to the end of the robot’s arm allow the robot to lift parts,spot-weld ,paint,arc-weld,drill,deburr,and do a variety of tasks,depending on what is required of the robot.5. 
The robotic system can control the work cell of the operating robot.The work cell of the robot is the total environment in which the robot must perform itstask.Included within this cell may be the controller ,the robot manipulator ,a work table ,safety features,or a conveyor.All the equipment that is required in order for the robot to do its job is included in the work cell .In addition,signals from outside devices can communicate with the robot to tell the robot when it should parts,pick up parts,or unload parts to a conveyor.The robotic system has three basic components: the manipulator,the controller,and the power source.A.ManipulatorThe manipulator ,which does the physical work of the robotic system,consists of two sections:the mechanical section and the attached appendage.The manipulator also has a base to which the appendages are attached.Fig.1 illustrates the connectionof the base and the appendage of a robot.图1.Basic components of a robot’s manipulatorThe base of the manipulator is usually fixed to the floor of the work area. Sometimes,though,the base may be movable. In this case,the base is attached to either a rail or a track,allowing the manipulator to be moved from one location to anther.As mentioned previously ,the appendage extends from the base of the robot. The appendage is the arm of the robot. It can be either a straight ,movable arm or a jointed arm. The jointed arm is also known as an articulated arm.The appendages of the robot manipulator give the manipulator its various axes of motion. These axes are attached to a fixed base ,which,in turn,is secured to a mounting. This mounting ensures that the manipulator will in one location.At the end of the arm ,a wrist(see Fig 2)is connected. The wrist is made up of additional axes and a wrist flange. The wrist flange allows the robot user to connect different tooling to the wrist for different jobs.图2.Elements of a work cell from the topThe manipulator’s axes allow it to perform work within a certain area. The area is called the work cell of the robot ,and its size corresponds to the size of the manipulator.(Fid2)illustrates the work cell of a typical assembly ro bot.As the robot’s physical size increases,the size of the work cell must also increase.The movement of the manipulator is controlled by actuator,or drive systems.The actuator,or drive systems,allows the various axes to move within the work cell. The drive system can use electric,hydraulic,or pneumatic power.The energy developed by the drive system is converted to mechanical power by various mechanical power systems.The drive systems are coupled through mechanical linkages.These linkages,in turn,drive the different axes of the robot.The mechanical linkages may be composed of chain,gear,and ball screws.B.ControllerThe controller in the robotic system is the heart of the operation .The controller stores preprogrammed information for later recall,controls peripheral devices,and communicates with computers within the plant for constant updates in production.The controller is used to control the robot manipulator’s movements as well as to control peripheral components within the work cell. 
The user can program the movements of the manipulator into the controller through the use of a hard-held teach pendant.This information is stored in the memory of the controller for later recall.The controller stores all program data for the robotic system.It can store several differentprograms,and any of these programs can be edited.The controller is also required to communicate with peripheral equipment within the work cell. For example,the controller has an input line that identifies when a machining operation is completed.When the machine cycle is completed,the input line turn on telling the controller to position the manipulator so that it can pick up the finished part.Then ,a new part is picked up by the manipulator and placed into the machine.Next,the controller signals the machine to start operation.The controller can be made from mechanically operated drums that step through a sequence of events.This type of controller operates with a very simple robotic system.The controllers found on the majority of robotic systems are more complex devices and represent state-of-the-art eletronoics.That is,they are microprocessor-operated.these microprocessors are either 8-bit,16-bit,or 32-bit processors.this power allows the controller to be very flexible in its operation.The controller can send electric signals over communication lines that allow it to talk with the various axes of the manipulator. This two-way communication between the robot manipulator and the controller maintains a constant update of the end the operation of the system.The controller also controls any tooling placed on the end of the robot’s wrist.The controller also has the job of communicating with the different plant computers. The communication link establishes the robot as part a computer-assisted manufacturing (CAM)system.As the basic definition stated,the robot is a reprogrammable,multifunctional manipulator.Therefore,the controller must contain some of memory stage. The microprocessor-based systems operates in conjunction with solid-state devices.These memory devices may be magnetic bubbles,random-access memory,floppy disks,or magnetic tape.Each memory storage device stores program information fir or for editing.C.power supplyThe power supply is the unit that supplies power to the controller and the manipulator. The type of power are delivered to the robotic system. One type of power is the AC power for operation of the controller. The other type of power isused for driving the various axes of the manipulator. For example,if the robot manipulator is controlled by hydraulic or pneumatic drives,control signals are sent to these devices causing motion of the robot.For each robotic system,power is required to operate the manipulator .This power can be developed from either a hydraulic power source,a pneumatic power source,or an electric power source.There power sources are part of the total components of the robotic work cell.中文翻译机器人工业机器人是在生产环境中用以提高生产效率的工具,它能做常规乏味的装配线工作,或能做那些对于工人来说是危险的工作,例如,第一代工业机器人是用来在核电站中更换核燃料棒,如果人去做这项工作,将会遭受有害放射线的辐射。
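下面用一小段示意代码说明上文控制器部分所描述的机床上下料握手流程(仅为示意,其中的 io、manipulator 等接口均为假设的占位对象,并非某一具体控制器的真实API):

```python
# Sketch of the work-cell handshake described above: wait for the machine cycle
# to finish, swap the finished part for a raw one, then restart the machine.
# The io / manipulator objects and their methods are hypothetical placeholders.

def tend_machine(io, manipulator):
    io.wait_for_input("machine_cycle_complete")   # input line turns on when machining ends
    manipulator.move_to("machine_chuck")
    finished = manipulator.pick()                 # remove the finished part
    manipulator.move_to("output_conveyor")
    manipulator.place(finished)
    manipulator.move_to("input_conveyor")
    raw = manipulator.pick()                      # pick up a new raw part
    manipulator.move_to("machine_chuck")
    manipulator.place(raw)
    io.set_output("machine_start", True)          # signal the machine to start the next cycle
```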

翻译:管道检测机器人

译文600:管道检测机器人

HELI-PIPE家族包括四种不同类型的管内检测机器人。

该机器人具有由万向节铰接的两个部分。

一个部分(定子)由一组车轮沿管道引导,平行于管子轴线移动;另一部分(转子)由于其车轮相对管轴倾斜,被迫绕管轴做螺旋运动。

单电机(带齿轮减速机内置)被放置在两个机构之间,以产生运动(没有直动式轮毂)。

所有车轮都安装在悬架上,以适应管径变化和管内的弯道。

这些机器人是自主式的,携带自己的电池和无线链路。

D-170是用于170毫米管径的机器人,其电机(置于定子中)的轴与转子刚性连接,设计用于曲率较小(半径大于600mm)的管道。

图1:D-170 HELI-PIPE管道机器人(视频1:D-170侧视图;视频2:D-170顶视图)

D-70/1是用于70毫米管径的第一个原型,设计用于弯曲管道(半径大于170毫米)。

带背隙的万向节放置在定子和转子之间。

图2:D-70/1 HELI-PIPE管道机器人(视频:D-70在弯管中)

D-70/2是第二个原型,用途与D-70/1相同,但采用了替代设计。

在这个体系结构中,马达,电池和无线链路被安装在转子上。

图3:D-70/2 HELI-PIPE管道机器人

D-40是用于40毫米管径的机器人,由两个万向节分开的三个机构组成,可通过弯管段(曲率半径大于110毫米)。

第一个机构是装有倾斜车轮的转子,第二个包含电动机和减速器,第三个是装有轴向车轮、电源和通信装置的定子。

图4:D-40 HELI-PIPE管道机器人(视频1:D-40侧视图;视频2:D-40近景)

原型机的自主续航时间约为两个小时。

这种架构非常简单,其旋转运动可以被利用来执行擦洗或检验任务。

机器人外文翻译(中英文翻译)

机器人外文翻译(中英文翻译)机器人外文翻译(中英文翻译)With the rapid development of technology, the use of robots has become increasingly prevalent in various industries. Robots are now commonly employed to perform tasks that are dangerous, repetitive, or require a high level of precision. However, in order for robots to effectively communicate with humans and fulfill their intended functions, accurate translation between different languages is crucial. In this article, we will explore the importance of machine translation in enabling robots to perform translation tasks, as well as discuss current advancements and challenges in this field.1. IntroductionMachine translation refers to the use of computer algorithms to automatically translate text or speech from one language to another. The ultimate goal of machine translation is to produce translations that are as accurate and natural as those generated by human translators. In the context of robots, machine translation plays a vital role in allowing them to understand and respond to human commands, as well as facilitating communication between robots of different origins.2. Advancements in Machine TranslationThe field of machine translation has experienced significant advancements in recent years, thanks to breakthroughs in artificial intelligence and deep learning. These advancements have led to the development of neural machine translation (NMT) systems, which have greatly improved translation quality. NMT models operate by analyzinglarge amounts of bilingual data, allowing them to learn the syntactic and semantic structures of different languages. As a result, NMT systems are capable of providing more accurate translations compared to traditional rule-based or statistical machine translation approaches.3. Challenges in Machine Translation for RobotsAlthough the advancements in machine translation have greatly improved translation quality, there are still challenges that need to be addressed when applying machine translation to robots. One prominent challenge is the variability of language use, including slang, idioms, and cultural references. These nuances can pose difficulties for machine translation systems, as they often require a deep understanding of the context and cultural background. Researchers are currently working on developing techniques to enhance the ability of machine translation systems to handle such linguistic variations.Another challenge is the real-time requirement of translation in a robotic setting. Robots often need to process and translate information on the fly, and any delay in translation can affect the overall performance and efficiency of the robot. Optimizing translation speed without sacrificing translation quality is an ongoing challenge for researchers in the field.4. Applications of Robot TranslationThe ability for robots to translate languages opens up a wide range of applications in various industries. One application is in the field of customer service, where robots can assist customers in multiple languages, providing support and information. Another application is in healthcare settings, where robots can act as interpreters between healthcare professionals and patientswho may speak different languages. Moreover, in international business and diplomacy, robots equipped with translation capabilities can bridge language barriers and facilitate effective communication between parties.5. 
ConclusionIn conclusion, machine translation plays a crucial role in enabling robots to effectively communicate with humans and fulfill their intended functions. The advancements in neural machine translation have greatly improved translation quality, but challenges such as language variability and real-time translation requirements still exist. With continuous research and innovation, the future of machine translation for robots holds great potential in various industries, revolutionizing the way we communicate and interact with technology.。
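As a minimal sketch of how such a translation step might be invoked from a robot's software (assumptions: the Hugging Face transformers library is installed and the Helsinki-NLP/opus-mt-en-zh checkpoint is available; a real system would also need the latency handling discussed above):

```python
from transformers import pipeline

# Load an off-the-shelf English-to-Chinese NMT model (model choice is an assumption)
translator = pipeline("translation", model="Helsinki-NLP/opus-mt-en-zh")

def translate_command(text: str) -> str:
    """Translate an operator command before the robot parses it."""
    result = translator(text, max_length=128)
    return result[0]["translation_text"]

print(translate_command("Please move the finished part to the second conveyor."))
```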

CCTV管道检测机器人操作维护

CCTV管道检测机器人操作维护

1. 简介

CCTV(Closed Circuit Television)管道检测机器人是一种专门用于检测管道的机器人。

它可以进入地下或难以到达的地方,检测管道是否存在损坏或堵塞情况,以及进行修复或清理。

本文将认真介绍该机器人的操作和维护。

2. 操作

2.1 准备工作

在操作CCTV管道检测机器人之前,需要进行一些准备工作。

首先,需要确认管道的类型和大小,以确定选择合适的机器人。

其次,需要检查全部设备是否处于正常工作状态,例如电源、信号线和人机界面等。

最后,需要进行安全防范措施,确保操作人员的安全。

2.2 操作步骤

CCTV管道检测机器人的操作流程一般包括以下步骤:

1. 将机器人插入管道后,启动机器人并将其移动到待检测区域;
2. 打开机器人的摄像头和灯光,对管道进行拍照和录像;
3. 对管道进行数据采集,并记录到操作人员的手持设备中;
4. 如发现管道问题,及时更改机器人的工作模式,例如清理、修补或更换设备等;
5. 检测完成后,将机器人从管道中取出,并检查全部设备是否完好。

2.3 操作注意事项

在CCTV管道检测机器人的操作过程中,需要注意以下事项:

1. 操作前需要进行充分的准备;
2. 操作人员需要严格按照操作步骤执行,并严格遵守安全规定;
3. 操作人员需要认真观察机器人的工作状态,并及时选择合适的工作模式;
4. 全程记录数据,并及时处理检测出的问题。

3. 维护

3.1 维护类型

CCTV管道检测机器人的维护一般分为两种类型,一种是定期保养型维护,另一种是故障维护。

定期保养型维护:定期对机器人进行检查和保养,以确保机器人状态良好,工作稳定,操作安全牢靠。

故障维护:当机器人显现故障时,进行适时修理和更换设备,以确保机器人的正常工作。

3.2 维护工作

CCTV管道检测机器人的维护工作一般包括以下内容:

1. 检查机器人的外观,确保机器人无严重损坏;
2. 检查机器人的电源,并确保电源可以正常供电;
3. 检查机器人的传感器和信号电线,确保传感器和信号电线处于正常工作状态;
4. 检查机器人的工作模式,并更换损坏的设备;
5. 清理机器人的传感器和摄像头,以确保数据采集的质量。

外文翻译--机器人技术简介

Introduction to robotics technologyIn the manufacturing field, robot development has focused on engineering robotic arms that perform manufacturing processes. In the space industry, robotics focuses on highly specialized, one-of-kind planetary rovers. Unlike a highly automated manufacturing plant, a planetary rover operating on the dark side of the moon -- without radio communication -- might run into unexpected situations. At a minimum, a planetary rover must have some source of sensory input, some way of interpreting that input, and a way of modifying its actions to respond to a changing world. Furthermore, the need to sense and adapt to a partially unknown environment requires intelligence (in other words, artificial intelligence).Mechanical platforms -- the hardware baseA robot consists of two main parts: the robot body and some form of artificial intelligence (AI) system. Many different body parts can be called a robot. Articulated arms are used in welding and painting; gantry and conveyor systems move parts in factories; and giant robotic machines move earth deep inside mines. One of the most interesting aspects of robots in general is their behavior, which requires a form of intelligence. The simplest behavior of a robot is locomotion. Typically, wheels are used as the underlying mechanism to make a robot move from one point to the next. And some force such as electricity is required to make the wheels turn under command.MotorsA variety of electric motors provide power to robots, allowing them to move material, parts, tools, or specialized devices with variousprogrammed motions. The efficiency rating of a motor describes how much of the electricity consumed is converted to mechanical energy. Let's take a look at some of the mechanical devices that are currently being used in modern robotics technology.Driving mechanismsGears and chains:Gears and chains are mechanical platforms that provide a strong and accurate way to transmit rotary motion from one place to another, possibly changing it along the way. The speed change between two gears depends upon the number of teeth on each gear. When a powered gear goes through a full rotation, it pulls the chain by the number of teeth on that gear.Pulleys and belts:Pulleys and belts, two other types of mechanical platforms used in robots, work the same way as gears and chains. Pulleys are wheels with a groove around the edge, and belts are the rubber loops that fit in that groove.Gearboxes:A gearbox operates on the same principles as the gear and chain, without the chain. Gearboxes require closer tolerances, since instead of using a large loose chain to transfer force and adjust for misalignments, the gears mesh directly with each other. Examples of gearboxes can be found on the transmission in a car, the timing mechanism in a grandfather clock, and the paper-feed of your printer.Power suppliesPower supplies are generally provided by two types of battery. Primary batteries are used once and then discarded; secondary batteries operate from a (mostly) reversible chemical reaction and can be recharged several times. Primary batteries have higher density and a lower self-dischargerate. Secondary (rechargeable) batteries have less energy than primary batteries, but can be recharged up to a thousand times depending on their chemistry and environment. 
Typically the first use of a rechargeable battery gives 4 hours of continuous operation in an application or robot.SensorsRobots react according to a basic temporal measurement, requiring different kinds of sensors.In most systems a sense of time is built-in through the circuits and programming. For this to be productive in practice, a robot has to have perceptual hardware and software, which updates quickly. Regardless of sensor hardware or software, sensing and sensors can be thought of as interacting with external events (in other words, the outside world). The sensor measures some attribute of the world. The term transducer is often used interchangeably with sensor. A transducer is the mechanism, or element, of the sensor that transforms the energy associated with what is being measured into another form of energy. A sensor receives energy and transmits a signal to a display or computer. Sensors use transducers to change the input signal (sound, light, pressure, temperature, etc.) into an analog or digital form capable of being used by a robot.Microcontroller systemsMicrocontrollers (MCUs) are intelligent electronic devices used inside robots. They deliver functions similar to those performed by a microprocessor (central processing unit, or CPU) inside a personal computer. MCUs are slower and can address less memory than CPUs, but are designed for real-world control problems. One of the major differences between CPUs and MCUs is the number of external components needed tooperate them. MCUs can often run with zero external parts, and typically need only an external crystal or oscillator.Utilities and toolsROBOOP (A robotics object oriented package in C++):This package is an object-oriented toolbox in C++ for robotics simulation. Technical references and downloads are provided in the Resources.CORBA: A real-time communications and object request broker software package for embedding distributed software agents. Each independent piece of software registers itself and its capabilities to the ORB, by means of an IDL (Interface Definition Language). Visit their Web site (see Resources) for technical information, downloads, and documentation for CORBA.TANGO/TACO:This software might be useful for controlling a robotics system with multiple devices and tools. TANGO is an object oriented control system based on CORBA. Device servers can be written in C++ or Java. TACO is object oriented because it treats all(physical and logical) control points in a control system as objects in a distributed environment. All actions are implemented in classes. New classes can be constructed out of existing classes in a hierarchical manner, thereby ensuring a high level of software reuse. Classes can be written in C++, in C (using a methodology called Objects in C), in Python or in LabView (using the G programming language).ControllersTask Control Architecture: The Task Control Architecture (TCA) simplifies building task-level control systems for mobile robots. "Task-level" refers to the integration and coordination of perception, planning, andreal time control to achieve a given set of goals (tasks). TCA provides a general control framework, and is intended to control a wide variety of robots. TCA provides a high-level machine-independent method for passing messages between distributed machines (including between Lisp and C processes). TCA provides control functions, such as task decomposition, monitoring, and resource management, that are common to many mobile robot applications. 
The Resources section provides technical references and download information for Task Control Architecture.EMC (Enhanced Machine Controller): The EMC software is based on the NIST Real time Control System (RCS) methodology, and is programmed using the NIST RCS Library. The RCS Library eases the porting of controller code to a variety of UNIX and Microsoft platforms, providing a neutral application programming interface (API) to operating system resources such as shared memory, semaphores and timers. The EMC software is written in C and C++, and has been ported to the PC Linux, Windows NT, and Sun Solaris operating systems.Darwin2K: Darwin2K is a free, open source toolkit for robot simulation and automated design. It features numerous simulation capabilities and an evolutionary algorithm capable of automatically synthesizing and optimizing robot designs to meet task-specific performance objectives.LanguagesRoboML (Robotic Markup Language): RoboML is used for standardized representation of robotics-related data. It is designed to support communication language between human-robot interface agents, as well as between robot-hosted processes and between interface processes, and to provide a format for archived data used by human-robot interface agents.ROSSUM: A programming and simulation environment for mobile robots. The Rossum Project is an attempt to help collect, develop, and distribute software for robotics applications. The Rossum Project hopes to extend the same kind of collaboration to the development of robotic software.XRCL (Extensible Robot Control Language): XRCL (pronounced zircle) is a relatively simple, modern language and environment designed to allow robotics researchers to share ideas by sharing code. It is an open source project, protected by the GNU Copyleft.SummaryThe field of robotics has created a large class of robots with basic physical and navigational competencies. At the same time, society has begun to move towards incorporating robots into everyday life, from entertainment to health care. Moreover, robots could free a large number of people from hazardous situations, essentially allowing them to be used as replacements for human beings. Many of the applications being pursued by AI robotics researchers are already fulfilling that potential. In addition, robots can be used for more commonplace tasks such as janitorial work. Whereas robots were initially developed for dirty, dull, and dangerous applications, they are now being considered as personal assistants. Regardless of application, robots will require more rather than less intelligence, and will thereby have a significant impact on our society in the future as technology expands to new horizons.外文出处:Robotic technology / edited by A. Pugh./P. Peregrinus, c1993.附件1:外文资料翻译译文机器人技术简介在制造业领域,机器人的开发集中在执行制造过程的工程机器人手臂上。

机器人的组成外文翻译

附录外文翻译:机器人的组成(1)hobby engineeringIts easier to learn about building robots if you take it one step at a time. This menu breaks a fairly complex robot into bite-sized pieces (or byte-sized for you programmers) to make the information easier to digest. We think that this is a good way to both learn about robots and to plan your actual construction. If you try to do everything at once you are likely to end up with a mess. If you plan and build in small steps, you are almost guaranteed success.(2)ControllersThe controller is the brains of your computer. The controller receives information from sensors and the input part of the human interface. It then decides what to do and sends instructions to the motion systems, actuators and the output part of the human interface. There are many types of controllers with different amounts of processing power and varying numbers and types of "pins" which connect to sensors, motors and the other part of the robot.In order for your controller or your robot to do anything, you must write a program and load that program into the controller's memory. Depending on your choice of controller, this can be fairly simple or extremely complicated. We recommend controllers using the Parallax Basic Stamp as the starting point for almost everyone. These controllers are programmed in an easy-to-learn language and have an integrated program loading system that is nearly 100% reliable. The Basic Stamp almost guarantees a fast start in programming whereas most alternatives require overcoming a significant level of difficulty just to get started. Even if you intend to "graduate" to more complex programming systems you will probably find the Basic Stamp a useful tool for investigating new ideas before developing your production code.(3)SensorsSensors provide your robot with information about its environment. Different sensors tell your robot about sights, sounds, pressures, temperatures and many other characteristics of the world around it.In many cases sensor components provide "data" when what you want is "information". Forinstance, a sonar component may report that an echo came back in .05ms when what you really want to know is that a robot is charging you from two feet away. In some cases the volume of data from sensor components is more than can be handled by a robot controller -- too much data can be as useless as no data at all. Because of this, many of sensor products are actually "smart" subsystems with specialized logic to evaluate the data stream and simplify programming your robot's main controller.When considering sensors, your first step is to identify what you want your robot to sense and how quickly and reliably you want to acquire that information. While "I want to know everything, right now, without error" sounds like a good specification, it probably isn't achievable and it definitely wouldn't be affordable. All practical sensors have definite limits of accuracy, range, resolution and repeatability. Each little increase in performance requires a large increase in cost so you will often accept what you can afford rather than insist on what you would like.While sensors are warranted to meet their specifications, they aren't guaranteed to do what you want in the way you want. While an IR distance sensor may be 99% accurate in the testing lab, your results may be less perfect in a competitive environment when your sensors may get confused by random reflections, your opponents sensors or even intentional interference. 
In order to be fully effective, you may need to compare the results from multiple sensors and/or filter the data to ignore results that seem inconsistent. As with every part of your robot, maximum effectiveness requires careful evaluation of real-world results and fine-tuning of your robot's circuits and program. This is not a "Plug And Play" hobby!When evaluating sensors, you want to know the following:○1What is actually being measured? For example, most distance sensors don't really measure distance. They measure how long it takes to receive an echo after they send a signal. You have to consider the possibility that the echo" is actually a stray signal and then find a way to eliminate those false readings. Reliable distance tracking systems usually look for patterns of consistent readings○2How many connections of what kind are required to connect the sensor to your controller? Do you have enough of those kinds of pins available on your controller? Does the signal need to be processed though an ADC or other hardware device to be usable by your program?○3How much power does the sensor require and at what voltage(s)? Will you have to increase your robots battery and power regulation capacity?A thorough technical evaluation of a sensor may require more knowledge than you possess. Fortunately, you can generally get good results by relying on common sense and the helpful nature of most other builders. The sensors we offer have all been used successful by builders of varying levels, so you can feel confident that you aren't attempting the impossible when you select one of the sensors we offer. We have tried to write product descriptions that translate the technical specifications into common English (common American to our off-shore friends) -- but keep in mind that something can get lost in any translation. Finally a search of the WEB will find you many examples of circuits and programs. (In the near future we will have our own samples posted with each product description.)(4)Robot Base KitsThe base is your robots skelton and it main functions are to hold all the other parts together and to protect delicate parts from harm. A base can be as simple as a scrap piece of wood or as complicated as a space ship.In many cases the design of the base is completely intertwined with the design of the motion systems. Sometimes the mechanical components for a robot can be "borrowed" from a toy or other hobby. Radio controlled planes, cars and boats (including submarines) have been used to provide the base and motion system of robots.(5)Human Communications SystemsYour robot can "talk" to you via computer generated voice, blinking lights and text displays. It can "listen" to your instructions sent by keyboard, switches or wireless remote control. Computer people prefer to use the word "output" instead of "talk" and "input" instead of "listen", but you know what they mean.(6)ActuatorsAn actuator is any device that makes your robot do something. Motion Systems and the output part of the Human Communications Systems are just specialized actuators which are important enough that we though they deserved their own sections.Actuators can move things or control other devices. Almost any device operated with electricitycan become an actuator. Depending on the device it may be connected directly to the controller or indirectly by an H-Bridge or relay. Your robot can also control things remotely using radio frequency or infrared transmission or even over the internet. 
The X-10 home control system allows your robot to control household lights and appliances.(7)Motion SystemsRobots are usually moved by a combination of wheels, gears, motors and associated electronics. Sometimes motor systems are assembled piece-by-piece but the most common form of robot motion these days comes from servos similar to the kind used with radio controlled airplanes. While most robots roll, it is possible to build robots that walk, jump or even swim or fly.A continuous rotation servo is a modified hobby servo that can rotate 360 degrees in either direction. These servos are economical and provide a neat system of motors, gears and electronics that can be directly connected to most robot controllers. A number of different types of wheels are available which attach directly to the servo axle. Most simple robots use two servos to provide both motion and direction control. Direction is controlled by what is called "differential steering" -- steering by varying the speed and direction of each wheel. If your robot needs to turn left, just slow or stop the left wheel servo while maintaining or increasing the speed of the right wheel. The bigger the difference in speed, the sharper the turn.Motion systems can become as complicated as you choose and often require custom design and building of the mechanical and electronic components. Sometimes the mechanical components can be "borrowed" from a toy or other hobby. Radio controlled planes, cars and boats (including submarines) have been used to provide the base and motion system of robots.An H-Bridge is an electronic circuit which translates and boosts controller output signals to the level required to drive a standard electric motor with variable speed and direction. An H-Bridge is built into hobby servos, so they can be connected directly to the controller. When using other types of motors you need to provide your own H-Bridge. These can be purchased as completed assemblies or assembled from components.(1)爱好工程如果你采取每次一小步的策略。
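Two of the calculations described in this section, converting a sonar echo time into a distance and mixing forward and turn commands into differential wheel speeds, can be sketched as follows (an illustration only; the clamping convention and the -1..1 speed range are assumptions, not a specific servo driver's API):

```python
SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 degrees C

def sonar_distance(echo_time_s: float) -> float:
    """Convert a round-trip sonar echo time into a one-way distance in metres."""
    return SPEED_OF_SOUND * echo_time_s / 2.0

def differential_drive(forward: float, turn: float) -> tuple[float, float]:
    """Map forward and turn commands (both in -1..1) to left/right wheel speeds:
    slowing one wheel turns the robot toward that side, and a bigger speed
    difference gives a sharper turn."""
    left, right = forward + turn, forward - turn
    scale = max(1.0, abs(left), abs(right))   # keep both wheels within -1..1
    return left / scale, right / scale

print(f"{sonar_distance(0.0035):.2f} m")   # a 3.5 ms echo is roughly 0.60 m away
print(differential_drive(0.6, -0.3))       # left wheel slower than right -> turn left
```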

URT管道检测机器人

管道检测机器人在城市污水、天然气输送、工业物料运输、给排水和建筑物通风系统等领域里,管道作为一种有效的物料输送手段而广泛应用。

为提高管道的寿命、防止泄漏等事故的发生,就必须对管道进行有效的检测维护等,而目前管道检测和维护多采用管道机器人来进行[1]。

所谓管道机器人就是一种可沿管道内部或外部自动行走、携带一种或多种传感器件如位置和姿态传感器、超声传感器、涡流传感器等以及操作机械如管道裂纹与管道接口焊接装置、防腐喷涂装置、操作手、喷枪、刷子等,在工作人员的遥控操纵或计算机控制下可在极其恶劣的环境中,能够完成一系列管道检测维修作业的机电一体化系统。

管道机器人可完成的管道作业有:生产、施工过程中的管道内外质量检测;管道内部清扫、抛光、焊接、喷涂等维护;对接焊缝的探伤、补口作业;旧管道腐蚀程度、破损情况检测和泄漏预报等等[2 3]。

1 管道机器人的发展状况

1.1 管道机器人的理论研究发展状况

管道机器人的研究所涉及的面很广,随着70年代电子技术、计算机技术、自动化技术的发展和进步,国外的管道机器人技术自90年代初以来得到了迅猛发展并接近于应用水平。

1987年日本学者T.Morimitsu 等人成功研制了一种振动式管内移动机器人。

1999年西班牙Jorge Moraleda与Anibal Ollero等人在西班牙军工基金资助下,利用水流喷射产生的冲力作为驱动力研制成检测地下输水管道内部状况的管道机器人系统。

2000年日本横滨国立大学电子与计算机工程系Chi Zhu等人研制成功用于检测污水排放管道的管道检测机器人,它适用于直径为200mm的管道。

2001年美国纽约煤气集团公司的Daphne D Zurko和卡内基梅隆大学机器人技术学院Hagen Schempf博士在美国国家航空和宇宙航行局的资助下开发了长距离、无缆方式的管道机器人系统。

我国管道机器人研制工作起步较晚,已见报道的管道机器人多为国外进口,然而近些年来,管道机器人的经济、技术和社会意义逐渐为更多的人们所认识,也有一些单位开始进行研制,并在机构模型、动力学分析以及实验样机等方面均有所建树。

机器人机构设计中英文对照外文翻译文献

FEM Optimization for Robot Structure
Wang Shijun, Zhao Jinjuan*
Department of Mechanical Engineering, Xi'anUniversity of Technology
Shaanxi Province, People's Republic of China

Industrial robot is a kind of machine, which is controlled by computers. Because efficiency and maneuverability are higher than traditional machines, industrial robot is used extensively in industry. For the sake of efficiency and maneuverability, reducing mass and increasing stiffness is more important than traditional machines, in structure design of industrial robot.
Fig. 1 shows the main framework of a parallel robot with three bars. When the length of three bars are changed, conjunct end of three bars can follow a given track, where robot hand is installed. Core of top beam is triangle, owing to three bars used in the design, which is showed in Fig.2. Use of three bars makes top beam nonsymmetrical along the plane that is defined by two columns. According to a qualitative analysis from Fig.1, stiffness values along z-axis are different at three joint locations on the top beam and stiffness at the location between bar 1 and top beam is lowest, which is confirmed by computing results of finite element, too. According to design goal, stiffness difference at three joint locations must be within a given tolerance. Inconsistent stiffness will have influence on the motion accuracy of the manipulator under high load, so it is necessary to find the accurate location of top beam along x-axis.

机器人外文翻译(中英文翻译)

外文翻译:机器人(The robot)

1. 机器人的作用

机器人是高级整合控制论、机械电子、计算机、材料和仿生学的产物。

在工业、医学、农业、建筑业甚至军事等领域中均有重要用途。

现在,国际上对机器人的概念已经逐渐趋近一致。

一般说来,人们都可以接受这种说法,即机器人是靠自身动力和控制能力来实现各种功能的一种机器。

联合国标准化组织采纳了美国机器人协会给机器人下的定义:“一种可编程和多功能的,用来搬运材料、零件、工具的操作机;或是为了执行不同的任务而具有可改变和可编程动作的专门系统。

2. 能力评价标准

机器人能力的评价标准包括:智能,指感觉和感知,包括记忆、运算、比较、鉴别、判断、决策、学习和逻辑推理等;机能,指变通性、通用性或空间占有性等;物理能,指力、速度、连续运行能力、可靠性、联用性、寿命等。

因此,可以说机器人是具有生物功能的三维空间坐标机器。

3. 机器人的组成

机器人一般由执行机构、驱动装置、检测装置和控制系统等组成。

执行机构即机器人本体,其臂部一般采用空间开链连杆机构,其中的运动副(转动副或移动副)常称为关节,关节个数通常即为机器人的自由度数。

根据关节配置型式和运动坐标形式的不同,机器人执行机构可分为直角坐标式、圆柱坐标式、极坐标式和关节坐标式等类型。

出于拟人化的考虑,常将机器人本体的有关部位分别称为基座、腰部、臂部、腕部、手部(夹持器或末端执行器)和行走部(对于移动机器人)等。

驱动装置是驱使执行机构运动的机构,按照控制系统发出的指令信号,借助于动力元件使机器人进行动作。

它输入的是电信号,输出的是线、角位移量。

机器人使用的驱动装置主要是电力驱动装置,如步进电机、伺服电机等,此外也有采用液压、气动等驱动装置。

检测装置的作用是实时检测机器人的运动及工作情况,根据需要反馈给控制系统,与设定信息进行比较后,对执行机构进行调整,以保证机器人的动作符合预定的要求。

外文翻译--- 萤光灯管检测室内移动机器人

毕业设计外文资料翻译题目荧光管检测室内移动机器人专业机械设计制造及其自动化班级学生学号指导教师二〇一二年四月八日Autonomous Indoor Mobile Robot Navigation by detectingFluorescentTubesFabien LAUNAY Akihisa OHYA Shin’ichi YUTA Intelligent Robot Laboratory, University of Tsukuba 1-1-1 Tennoudai,Tsukuba, Ibaraki 305-8573 JAPAN{launay,ohya,yuta}@roboken.esys.tsukuba.ac.jpAbstractThis paper proposes an indoor navigation system for an autonomous mobile robot including the teaching of its environment. The self-localization of the vehicle is done by detecting the position and orientation of fluorescent tubes located above it’s desired path thanks to a camera pointing to the ceiling.A map of the lights based on odometry data is built in advance by the robot guided by an operator. Then a graphic user interface is used to define the trajectory the robot must follow with respect to the lights. While the robot is moving, the position and orientation of the lights it detects are compared to the map values, which enables the vehicle to cancel odometry errors.1 IntroductionWhen a wheel type mobile robot navigates on a two dimensional plane, it can use sensors to know its relative localization by summing elementary displacements provided by incremental encoders mounted on its wheels. The main default of this method known as odometry is that its estimation error tends to increase unboundedly[1]. For long distance navigation, odometry and other dead reckoning solutions may be supported by an absolute localization technique providing position information with a low frequency.Absolute localization in indoor navigation using landmarks located on the ground or on the walls is sometimes difficult to implement since different objects can obstruct them. Therefore a navigation system based on ceiling landmark recognition can be thought as an alternative to this issue.The navigation system we developed consists in two steps. In the first step, the vehicle is provided with a map of the ceiling lights. Building such a map by hand quickly becomes a heavy task as its size grows. Instead, the robot is guided manually under each light and builds the map automatically. The second step consists in defining a navigation path for the vehicleand enabling its position and orientation correction whenever it detects a light recorded previously in the map.Since the map built by the robot is based on odometry whose estimation error grows unboundedly, the position and orientation of the lights in the map do not correspond to the reality. However, if the trajectory to be followed by the vehicle during the navigation process is defined appropriately above this distorted map, it will be possible for the robot to move along any desired trajectory in the real world.A GUI has been developed in order to facilitate this map-based path definition process.We equipped a mobile robot with a camera pointing to the ceiling. During the navigation process, when a light is detected, the robot calculates the position and the orientation of this landmark in its own reference and thanks to a map of the lights built in advance, it can estimate its absolute position and orientation with respect to its map.We define the pose of an object as its position and orientation with respect to a given referential.2 Related workThe idea of using lights as landmarks for indoor navigation is not new. Hashino[2] developed a fluorescent light sensor in order to detect the inclination angle between an unmanned vehicle and a fluorescent lamp attached to the ceiling. 
The objective was to carry out the main part of the process by hardware logic circuit.Instead of lights, openings in the ceiling for aerations have also been used as landmarks to track.Oota et al.[3] based this tracking on edge detection, whereas Fukuda[4] developed a more complex system using fuzzy template matching. Hashiba et al.[5] used the development images of the ceiling to propose a motion planning method. More recently, Amat et al.[6] presented a vision based navigation system using several fluorescent light tubes located in captured images whose absolute pose estimation accuracy is better than a GPS system.One advantage of the system proposed here is its low memory and processing speed requirements that make its implementation possible on a robot with limited image-processing hardware. Moreover, our navigation system includes a landmarks map construction process entirely based on the robot’s odometry data. The development of a GUI enables the connection between the lights map produced during the teaching process, and the autonomous robot navigation, which results in a complete navigation system. This is the main difference with the previous works which either assume the knowledge of the ceiling landmarks’exact pose thanks to CAD data of building maps, or require the absolute vehicle pose to be entered manually and periodically during the landmarks map construction so as to cancel odometry errors.Figure 1: Target environment consisting of lights of different shapes in corridors exposed to luminosityvariations due to sunnin g.3 Lights’map buildingIn order to cancel odometry errors whenever a light is detected, the robot needs to know in advancethe pose in a given referential of the lights under which it is supposed to navigate.Since we are aiming at long distance autonomous indoor navigation, the size of the landmarks map is unbounded. Building such a map manually becomes a heavy task for the operator and we believe that an autonomous mobile robot can cope with this issue.During the learning process, the vehicle equipped with a camera pointing to the ceiling is guided manually under each light and adds landmark information to the map whenever a new light appears above its path. This human assisted map building is the first step of our research concerning landmarks map building. We want to change it to a fully autonomous map building system. As the image-processing involved during the learning process is identical to the one used during the navigation, we will present the feature extraction method in sections 5 and 6.Once the teaching phase is completed, the robot holds a map of the lights that can be used later for the autonomous navigation process.4 Dealing with a robot-made map4.1 Odometry error’s influence on the mapAsking the robot to build a map implies dealing\ with odometry errors that will occur during the learning process itself. As the robot will be guided under new lights, because of the accumulation of odometry errors, the pose of the landmarks recorded in the map will become more and more different from the values corresponding to the real world.Several maps of the environment represented in Fig.1 are given in Fig.2. The odometry data recorded by the robot during the learning process has also been represented for one of the maps.4.2 Usage of the mapOnly one map is needed by the robot to correct its pose during the navigation process. 
Whenever the robot detects a light learnt previously, it corrects its absolute pose1 by using the landmark’s information recorded in the map. Since the map contents don’t correspond to the values of the real world, the trajectory of the robot has to be specified according to the pose of the lights in the map, and not according to the trajectory we want the robot to follow in its real environment.For example, if the mobile robot’s task is to navigate right below a straight corridor’s lights, the robot won’t be requested to follow a straight line along the middle of the corridor. Instead of this simple motion command, the robot will have to trace every segment which connects the projection on the ground of the center of two successive lights. This is illustrated in Fig.3 where a zoom of the trajectory specified to the robot appears in dotted line.A GUI has been developed in Tcl/Tk in order to specify easily different types of trajectories with respect to the map learnt by the robot. This GUI can also be used on-line in order to follow the evolution of the robot in real time on the landmarks map during the learning and navigation processes.Figure 2: Several maps of the environment represented Fig.1 built by the same robot. Rectangles andcircles represent lights of different shapes.5 Fluorescent tube detection5.1 Fluorescent tube modelIt is natural to think of fluorescent tube as a natural landmark for a vision-based process aimed at improving the localization of a mobile robot in an indoor environment. Indeed, problems such as dirt, shadows, light reflection on the ground, or obstruction of the landmarks usually do not appear in this case.One advantage of fluorescent tubes compared to other possible landmarks located on the ceiling is that once they are switched on, their recognition in an image can be performed with a very simple imageprocessing algorithm since they are the only bright elements that are permanently found in such a place.If a 256 grey levels image containing a fluorescent tube is binarized with an appropriate threshold 0 ≤T ≤255, the only element that remains after this operation is a rectangular shape. Fig.4.a shows a typical camera image of the ceiling of a corridor containing a fluorescent light. The axis of the camera is perpendicular to the ceiling. Shown in (b) is the binarized image of (a). If we suppose that the distance between the camera and the ceiling remains constant and that no more than one light at a time can be seen by the camera located on the top of the robot, a fluorescent tube can be modeled by a given area S0 in a thresholded image of the ceiling.Figure 4: (a) Sample image of a fluorescent light, (b) binarized image.5.2 Fluorescent light detection processUsing odometry, the robot is able to know when it gets close to a light recorded in its map by comparing in a close loop its actual estimated position to the different locations of the lights in the map. Once it gets close to one of them, it starts taking images of the ceiling and binarizing them with the threshold T used to memorize the corresponding light during the learning process.This step is repeated until the number N of pixels brighter than T becomes close to S0. When it happens to be true and when the binarized shape does not touch any border of the image, the detection algorithm is stopped and further image-processing is done as explained in the next section.In order to discard too bright images, the detection algorithm increases automatically the threshold. 
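A minimal sketch of this adaptive-threshold detection step (including the threshold decrease discussed just below) is given here, assuming the ceiling image is available as a NumPy grey-scale array; the expected area S0, the tolerance and the threshold step are illustrative values, not the ones used on the actual robot.

```python
import numpy as np

S0 = 4000          # expected tube area in pixels (illustrative assumption)
TOLERANCE = 0.15   # accepted relative deviation from S0 (assumption)
STEP = 5           # threshold adjustment step in grey levels (assumption)

def detect_tube(image, threshold):
    """Binarize a grey-scale ceiling image and adapt the threshold until the
    bright area is close to the expected tube area S0.

    Returns (mask, threshold) when a candidate tube is found, otherwise
    (None, threshold) so the caller can retry with the adapted threshold."""
    mask = image >= threshold
    n = int(mask.sum())                       # pixels brighter than T
    if abs(n - S0) <= TOLERANCE * S0:
        # reject shapes touching the image border, as required in the text
        touches_border = (mask[0].any() or mask[-1].any()
                          or mask[:, 0].any() or mask[:, -1].any())
        if not touches_border:
            return mask, threshold
        return None, threshold
    # too many bright pixels: the image is too bright, raise the threshold;
    # too few: the flickering tube appears darker than usual, lower it
    if n > S0:
        threshold = min(255, threshold + STEP)
    else:
        threshold = max(0, threshold - STEP)
    return None, threshold
```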
Moreover, because the intensity of the light emitted by fluorescent tubes changes with a frequency corresponding to the cycle of electric power, the threshold has to be decreased automatically if N ≤S0, so that the robot has a chance to detect the light located above it even if this one appears darker than usual.Figure 5: (a) Moment-based features of a fluorescent tube, (b) definition of the robot local reference.Figure 6: (a) Fluorescent light located on the border of the image. (a),(b) original and binarized images before distortion correction, (c),(d) after distortion correction6 Estimation of the absolute positionThe detection algorithm has been designed so that the pose of the fluorescent tube is calculated only if the whole shape of the tube appears in the captured image. Therefore it is possible to calculate the location of the tube’s centroid L in the image as well as the orientation of its least moment of inertia axis θL i using the moment-based features of the binarized shape Δrepresented in Fig.5.a.The above operations are evaluated on the image after its distortion has been corrected. Indeed, image distortion becomes a major issue whenever a light detected by the robot is not located in the center of the image. In this case, the error on the shape orientation can reach 5 degrees. Fig.6 shows an image taken by the robot when it is not exactly below the light and when its Yr axis defined in Fig5.b was parallel to the light’s main axis. The binarized shapes, before and after distortion correction are also given.Since the calculation of the vehicle’s absolute pose estimation from an image is time consuming, retroactive data fusion with odometry data is necessary[7]. This function is achieved thanks to an algorithm developed previously in our laboratory[8].Figure 7: The YAMABICO robot with its top board camera7 Implementation and experimentWe implemented this system on the Y AMABICO robot[9] developed in our laboratory and shown in Fig.7. The sensors used by the robot to estimate its pose are optical encoders mounted on the wheels and a board CCD black and white camera facing the ceiling. The navigation program and the absolute pose- estimation program based on fluorescent lights are implemented as independent modules. It is therefore possible to run simultaneously other pose estimation programs based on different landmarks and using different sensors without any modification of the existing vision-based module.The validity of the proposed navigation system has been shown by making experiments in the corridor shown in Fig.1 at different times of the day. The robot was first guided under each light in order to build the landmarks map. Then it used the map and a path defined above it to navigate in the middle of the corridor until a goal point. The maximum speed of the robot was 35 cm/s and total distance on one way was about 50 meters. On the robot’s path 24 fluorescent tubes of different shapes were present, separated by a distance. varying from 2.2 meters to 4.5 meters.The experimental results of one of those experiments are shown in Fig.8 where the bold line corresponds to the odometry data of the robot. The map used by the robot for this experiment and thecorresponding path are represented partially in Fig.3. When a light is found, the absolute pose of the robot is corrected after a certain delay represented by the distance between the marks ‘+’and ‘×’respectively. The table below gives average computing times for the different steps of the image-processing algorithm. 
All image-processing is done on board by a vision module developed in our laboratory which is equipped with a 20MHz Thomson T805 processor.For better understanding of the navigation process, the robot was asked in this experiment to enter the next segment on its trajectory one meter after passing the light corresponding to the end point of the previous segment. The point when the robot changes itstrajectory to enter a new segment is represented by a ‘◦’in Fig.8.Because the odometry errors that occur during the building of the map remain small for short distances, the relative angle between two successive segments used to specify the robot’s path remains acceptable (in average less than 2 degrees), even when the robot is far from the map’s origin. This interesting property could be observed during the experiments where the vehicle slightly moved away from the center line of the corridor when it entered a new segment, which lead to small pose correction when the next landmark was detected.8 Conclusions and future workIn this paper, we presented a complete navigation system that enables a mobile robot to achieve long distance indoor navigation thanks to lights located aboveits trajectory.In a first step, the robot builds in advance a map of these landmarks that can be detected easily. Once the map-building process is finished, the trajectory the vehicle has to follow is defined above the previous map thanks to a GUI in order to handle the errors included during the learning procedure. In the second step, the robot looks for the lights it has learnt and fuses its new estimated absolute pose with odometry whenever a landmark is detected during the navigation process.Experiments show that it is possible for the robot to navigate with precision on a long distance without any other position or orientation sensing system than optical encoders and a black and white camera pointing to the ceiling.The landmarks map building we presented needs at present a human operator to guide the robot during the learning process. We want to convert it to a fully autonomous map building system. Future work will also address how to extend the navigation system to several robots moving along corridors Because of the accumulation of odometry errors while the robot is guided for the first time throughout its environment, a light detected several times should not be recorded more than once in the map. Since the pose of the light computed by the robot in a global referential will be different whenever the vehicle re-encounters the same landmark during the learning process, further work involving loop trajectories management has to be done to cope with this issue.References[1] C. Ming Wang, “Location estimation and uncertainty analysis for mobile robots”, Proc. of 1988 International Conference on Robotics and Automation , pp 1230-1235.[2] S. Hashino, “Performance of a fluorescent light sensor”, Proc. of Robotics Society of Japan , 1994, pp 1077-1078, in Japanese.[3] N. Oota, T. Fukuda, F. Arai, Y. Abe, K. Tanaka, Y. Tanaka, “Landmark Tracking of Autonomous Mobile Robot with Vision System”, Proc. of Robotics Society of Japan ,1992, pp 935-936, in Japanese.[4] T. Fukuda, Y. Yokoyama, F. Arai, K. Shimojima, S. Ito, Y. Abe , K. Tanaka, S. Tanaka, “Navigation System based on Ceiling Landmark Recognition for Autonomous MobileRobot”, Proc. of 1996 International Conference on Robotics and Automation , pp 1720-1725.[5] M. Hashiba, T. Lillawa, T. Honma, S. 
Suzuki, “A Visual Navigation Method for Indoor Mobile Robots - A position Identification Method Using Development Images ofthe Route”, Proc. of Robotics Society of Japan , 1998, pp 1603-1604, in Japanese.[6] J. Amat, J. Fernandez, A. Casals, “Vision Based Navigation System for Autonomous Vehicles”, Proc. of 2000 International Conference on Intelligent Autonomous Systems , pp1049-1056.[7] A. Kosaka, M. Meng, A. Kak, “Vision-guided mobile robot navigation using retroactive updating of position uncertainty”, Proc. of 1993 International Conference onRobotics and Automation , pp 1-7.[8] S. Maeyama, A.Ohya, S.Yuta, “Non-stop outdoor navigation of a mobile robot - Retroactive positioning data fusion with a time consuming sensor system”, Proc. of 1995 IntelligentRobots and Systems , V ol.1, pp 130-135.[9] S. Yuta S. Suzuki, S. Iida, “Implementation of a small size experimental self-contained autonomous robot –sensors, vehicle control, and description of sensor based onbehavior–”, in Experimental Robotics II , R. Chatila et al., Eds. Berlin, Germany: Springer-Verlag, 1993, pp 344-358Intelligent Robot Laboratory, 2008,55(2):128-135萤光灯管检测室内移动机器人法钦思、爱丽丝斯塔智能机器人实验室,筑波大学摘要本文提出了一个室内导航系统自主移动机器人的教学环境。

Intelligent Robot - Foreign Literature Translation

RobotRobot is a type of mechantronics equipment which synthesizes the last research achievement of engine and precision engine, micro-electronics and computer, automation control and drive, sensor and message dispose and artificial intelligence and so on. With the development of economic and the demand for automation control, robot technology is developed quickly and all types of the robots products are come into being. The practicality use of robot products not only solves the problems which are difficult to operate for human being, but also advances the industrial automation program. At present, the research and development of robot involves several kinds of technology and the robot system configuration is so complex that the cost at large is high which to a certain extent limit the robot abroad use. To development economic practicality and high reliability robot system will be value to robot social application and economy development.With the rapid progress with the control economy and expanding of the modern cities, the let of sewage is increasing quickly: With the development of modern technology and the enhancement of consciousness about environment reserve, more and more people realized the importance and urgent of sewage disposal. Active bacteria method is an effective technique for sewage disposal,The lacunaris plastic is an effective basement for active bacteria adhesion for sewage disposal. The abundance requirement for lacunaris plastic makes it is a consequent for the plastic producing with automation and high productivity. Therefore, it is very necessary to design a manipulator that can automatically fulfill the plastic holding.With the analysis of the problems in the design of the plastic holding manipulator and synthesizing the robot research and development condition in recent years, a economic scheme is concluded on the basis of the analysis of mechanical configuration, transform system, drive device and control system and guided by the idea of the characteristic and complex of mechanical configuration, electronic, software and hardware. In this article, the mechanical configuration combines the character of direction coordinate and the arthrosis coordinate which can improve the stability and operation flexibility of the system. The main function of the transmission mechanism is to transmit power to implement department and complete the necessary movement. In this transmission structure, the screw transmission mechanism transmits the rotary motion into linear motion. Worm gear can give vary transmissionratio. Both of the transmission mechanisms have a characteristic of compact structure. The design of drive system often is limited by the environment condition and the factor of cost and technical lever. 'The step motor can receive digital signal directly and has the ability to response outer environment immediately and has no accumulation error, which often is used in driving system. In this driving system, open-loop control system is composed of stepping motor, which can satisfy the demand not only for control precision but also for the target of economic and practicality. on this basis, the analysis of stepping motor in power calculating and style selecting is also given.The analysis of kinematics and dynamics for object holding manipulator is given in completing the design of mechanical structure and drive system. Kinematics analysis is the basis of path programming and track control. 
The positive and reverse analysis of manipulator gives the relationship between manipulator space and drive sp ace in position and speed. The relationship between manipulator’s tip position and arthrosis angles is concluded by coordinate transform method. The geometry method is used in solving inverse kinematics problem and the result will provide theory evidence for control system. The f0unction of dynamics is to get the relationship between the movement and force and the target is to satisfy the demand of real time control. in this chamfer, Newton-Euripides method is used in analysis dynamic problem of the cleaning robot and the arthrosis force and torque are given which provide the foundation for step motor selecting and structure dynamic optimal ting.Control system is the key and core part of the object holding manipulator system design which will direct effect the reliability and practicality of the robot system in the division of configuration and control function and also will effect or limit the development cost and cycle. With the demand of the PCL-839 card, the PC computer which has a. tight structure and is easy to be extended is used as the principal computer cell and takes the function of system initialization, data operation and dispose, step motor drive and error diagnose and so on. A t the same time, the configuration structure features, task principles and the position function with high precision of the control card PCL-839 are analyzed. Hardware is the matter foundation of the control. System and the software is the spirit of the control system. The target of the software is to combine all the parts in optimizing style and to improve the efficiency and reliability of the control system. The software design of the object holding manipulator control system is divided into several blocks such assystem initialization block, data process block and error station detect and dispose model and so on. PCL-839 card can solve the communication between the main computer and the control cells and take the measure of reducing the influence of the outer signal to the control system.The start and stop frequency of the step motor is far lower than the maximum running frequency. In order to improve the efficiency of the step motor, the increase and decrease of the speed is must considered when the step motor running in high speed and start or stop with great acceleration. The increase and decrease of the motor’s speed can be controlled by the pulse frequency sent to the step motor drive with a rational method. This can be implemented either by hardware or by software. A step motor shift control method is proposed, which is simple to calculate, easy to realize and the theory means is straightforward. The motor' s acceleration can fit the torque-frequency curve properly with this method. And the amount of calculation load is less than the linear acceleration shift control method and the method which is based on the exponential rule to change speed. The method is tested by experiment.At last, the research content and the achievement are sum up and the problems and shortages in main the content are also listed. The development and application of robot in the future is expected.机器人机器人是典型的机电一体化装置,它综合运用了机械与精密机械、微电子与计算机、自动控制与驱动、传感器与信息处理以及人工智能等多学科的最新研究成果,随着经济的发展和各行各业对自动化程度要求的提高,机器人技术得到了迅速发展,出现了各种各样的机器人产品。
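The stepping-motor speed shift control described in the English text above lends itself to a short sketch. Since the paper's own shift law (fitted to the torque-frequency curve) is not reproduced in the text, the fragment below only shows a generic trapezoidal ramp with assumed start and maximum pulse frequencies.

```python
# Generic trapezoidal speed ramp for a step motor, expressed as the pulse
# frequency to send for each step. F_START, F_MAX and ACCEL_STEPS are
# assumptions for illustration; they are not the values used in the paper.

F_START = 500      # start/stop frequency in Hz (assumption)
F_MAX = 4000       # maximum running frequency in Hz (assumption)
ACCEL_STEPS = 50   # pulses spent accelerating or decelerating (assumption)

def ramp_profile(total_steps):
    """Return the pulse frequency for each of total_steps pulses:
    accelerate, cruise at F_MAX, then decelerate symmetrically."""
    accel = min(ACCEL_STEPS, total_steps // 2)
    freqs = []
    for i in range(total_steps):
        if i < accel:                                   # acceleration phase
            f = F_START + (F_MAX - F_START) * i / accel
        elif i >= total_steps - accel:                  # deceleration phase
            f = F_START + (F_MAX - F_START) * (total_steps - 1 - i) / accel
        else:                                           # constant-speed phase
            f = F_MAX
        freqs.append(f)
    return freqs
```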

Robot - Foreign Literature Translation (Literature Translation, Chinese-English)

外文翻译外文资料:RobotsFirst, I explain the background robots, robot technology development. It should be said it is a common scientific and technological development of a comprehensive results, for the socio-economic development of a significant impact on a science and technology. It attributed the development of all countries in the Second World War to strengthen the economic input on strengthening the country's economic development. But they also demand the development of the productive forces the inevitable result of human development itself is the inevitable result then with the development of humanity, people constantly discuss the natural process, in understanding and reconstructing the natural process, people need to be able to liberate a slave. So this is the slave people to be able to replace the complex and engaged in heavy manual labor, People do not realize right up to the world's understanding and transformation of this technology as well as people in the development process of an objective need. Robots are three stages of development, in other words, we are accustomed to regarding robots are divided into three categories. is a first-generation robots, also known as teach-type robot, it is through a computer, to control over one of a mechanical degrees of freedom Through teaching and information stored procedures, working hours to read out information, and then issued a directive so the robot can repeat according to the people at that time said the results show this kind of movement again, For example, the car spot welding robots, only to put this spot welding process, after teaching, and it is always a repeat of a work It has the external environment is no perception that the force manipulation of the size of the work piece there does not exist, welding 0S It does not know, then this fact from the first generation robot, it will exist this shortcoming, it in the 20th century, the late 1970s, people started to study the second-generation robot, called Robot with the feeling that This feeling with the robot is similar in function of a certain feeling, forinstance, force and touch, slipping, visual, hearing and who is analogous to that with all kinds of feelings, say in a robot grasping objects, In fact, it can be the size of feeling out, it can through visual, to be able to feel and identify its shape, size, color Grasping an egg, it adopted a acumen, aware of its power and the size of the slide. Third-generation robots, we were a robotics ideal pursued by the most advanced stage, called intelligent robots, So long as tell it what to do, not how to tell it to do, it will be able to complete the campaign, thinking and perception of this man-machine communication function and function Well, this current development or relative is in a smart part of the concept and meaning But the real significance of the integrity of this intelligent robot did not actually exist, but as we continued the development of science and technology, the concept of intelligent increasingly rich, it grows ever wider connotations.Now, I would like to briefly outline some of the industrial robot situation. So far, the industrial robot is the most mature and widely used category of a robot, now the world's total sales of 1.1 million Taiwan, which is the 1999 statistics, however, 1.1 million in Taiwan have been using the equipment is 75 million, this volume is not small. Overall, the Japanese industrial robots in this one, is the first of the robots to become the Kingdom, the United States have developed rapidly. 
Newly installed in several areas of Taiwan, which already exceeds Japan, China has only just begun to enter the stage of industrialization, has developed a variety of industrial robot prototype and small batch has been used in production.Spot welding robot is the auto production line, improve production efficiency and raise the quality of welding car, reduce the labor intensity of a robot. It is characterized by two pairs of robots for spot welding of steel plate, bearing a great need for the welding tongs, general in dozens of kilograms or more, then its speed in meters per second a 5-2 meter of such high-speed movement. So it is generally five to six degrees of freedom, load 30 to 120 kilograms, the great space, probably expected that the work of a spherical space, a high velocity, the concept of freedom, that is to say, Movement is relatively independent of the number of components, the equivalent of our body, waist is a rotary degree of freedom We have to be able to hold his arm, Arm can be bent, then this three degrees of freedom, Meanwhile there is a wristposture adjustment to the use of the three autonomy, the general robot has six degrees of freedom. We will be able to space the three locations, three postures, the robot fully achieved, and of course we have less than six degrees of freedom. Have more than six degrees of freedom robot, in different occasions the need to configure.The second category of service robots, with the development of industrialization, especially in the past decade, Robot development in the areas of application are continuously expanding, and now a very important characteristic, as we all know, Robot has gradually shifted from manufacturing to non-manufacturing and service industries, we are talking about the car manufacturer belonging to the manufacturing industry, However, the services sector including cleaning, refueling, rescue, rescue, relief, etc. These belong to the non-manufacturing industries and service industries, so here is compared with the industrial robot, it is a very important difference. It is primarily a mobile platform, it can move to sports, there are some arms operate, also installed some as a force sensor and visual sensors, ultrasonic ranging sensors, etc. It’s surrounding environment for the conduct of identification, to determine its campaign to complete some work, this is service robot’s one of the basic characteristics.For example, domestic robot is mainly embodied in the example of some of the carpets and flooring it to the regular cleaning and vacuuming. The robot it is very meaningful, it has sensors, it can furniture and people can identify, It automatically according to a law put to the ground under the road all cleaned up. This is also the home of some robot performance.The medical robots, nearly five years of relatively rapid development of new application areas. If people in the course of an operation, doctors surgery, is a fatigue, and the other manually operated accuracy is limited. Some universities in Germany, which, facing the spine, lumbar disc disease, the identification, can automatically use the robot-aided positioning, operation and surgery Like the United States have been more than 1,000 cases of human eyeball robot surgery, the robot, also including remote-controlled approach, the right of such gastrointestinal surgery, we see on the television inside. 
a manipulator, about the thickness fingers such a manipulator, inserted through the abdominal viscera, people on the screen operating the machines hand, it also used the method of laser lesion laser treatment, this is the case, peoplewould not have a very big damage to the human body.In reality, this right as a human liberation is a very good robots, medical robots it is very complex, while it is fully automated to complete all the work, there are difficulties, and generally are people to participate. This is America, the development of such a surgery Lin Bai an example, through the screen, through a remote control operator to control another manipulator, through the realization of the right abdominal surgery A few years ago our country the exhibition, the United States has been successful in achieving the right to the heart valve surgery and bypass surgery. This robot has in the area, caused a great sensation, but also, AESOP's surgical robot, In fact, it through some equipment to some of the lesions inspections, through a manipulator can be achieved on some parts of the operation Also including remotely operated manipulator, and many doctors are able to participate in the robot under surgery Robot doctor to include doctors with pliers, tweezers or a knife to replace the nurses, while lighting automatically to the doctor's movements linked, the doctor hands off, lighting went off, This is very good, a doctor's assistant.Robot is mankind's right-hand man; friendly coexistence can be a reliable friend. In future, we will see and there will be a robot space inside, as a mutual aide and friend. Robots will create the jobs issue. We believe that there would not be a "robot appointment of workers being laid off" situation, because people with the development of society, In fact the people from the heavy physical and dangerous environment liberated, so that people have a better position to work, to create a better spiritual wealth and cultural wealth.译文资料:机器人首先我介绍一下机器人产生的背景,机器人技术的发展,它应该说是一个科学技术发展共同的一个综合性的结果,同时,为社会经济发展产生了一个重大影响的一门科学技术,它的发展归功于在第二次世界大战中各国加强了经济的投入,就加强了本国的经济的发展。

Mechanical Manufacturing and Automation - Foreign Literature Translation -- Key Techniques of the X-ray Real-time Imaging Inspection Pipeline Robot

外文原文:Key Techniques of the X2ray Inspection Real-timeImaging Pipeline RobotThis paper presents a robotic system for weld-joint inspection of the big-caliber pipeline , which is developed for the purpose of being utilized as automation platform for X-ray real-time imaging inspection technique (RTIIT) . The robot can perform autonomous seeking and locating of weld-seam position in-pipe , and under the control of synchro-follow control technique it can accomplish the technologic task of weld inspection. The robotic system is equipped with a small focal spot and directional beam X-ray tube ,so the higher definition image of weld-seam can be obtained.Several key techniques about the robotic system developed are also explained in detail . Its construction is outlined.Key words : X-ray inspection ;real-time imaging ;robot0 IntroductionCompared with radiographic examination technique(RET) , X-ray real time imaging inspection technique(RTIIT) has many advantages such as higher efficiency ,lower cost , better feasible automation and weld-defects evaluation on-line. Furthermore , up to date technology allows the X-ray RTIIT to be used in Non-Destructive Testing (NDT) of pipelines , and the inspection quality of this Technique is as good as that of the RET[1 ,2 ] . Therefore ,NDT equipments , which are used commonly in pipeline inspection and basing on the RET , need to be renovated by basing on the X-ray RTIIT.To employ the X-ray RTIIT in NDT of pipeline there must be an automation platform , and X-ray inspection real-time imaging pipeline robot ( IRTIPR) is designed for the purpose. In fact , besides the problems that have been resolved[3 ] and are involved in the X-ray IRTIPR , several key techniques are presented in this paper , in which we address the robot focusing on its intelligent control, i . e.the autonomous motion in-pipe , the synchro-follow controltechnique and the communication ofcooperation between in-pipe and out-pipe , and we also outline the construction of the robot .1 Composing and Working Principle of the RobotThe X-ray IRTIPR consists of the two parts of in-pipe and out-pipe , as illustrated in Figl 1. The out-pipe part is composed of image collecting and processing system (8 ,9 ,10) , out-pipe synchro-rotary mechanism and its driving system (11 ,12) . The image intensifier is driven by the out-pipe rotary mechanism to rotate round the center of pipeline to collect weld image and transmit video signal to image processing computer by image-collecting card. The in-pipe part is composed of in-pipe computer (1) , power and inverters system (2) , walking and driving system (3) , X-ray system (4) , in-pipe synchro-rotary mechanism and its driving system (5 ,6) and weld-seam autonomous seeking and locating system (7) . The X-ray tube in X-ray system is driven by the in-pipe rotary mechanism to rotate round the center of pipeline.Fig.1 The structure of X-ray IRTIPRThe main working principle of the robot is explained as follows : Under the control of weld-seam autonomous seeking and locating system the in-pipe crawler finishes the localization of working position , at which the in-pipe crawler is in a state of waiting. When it receives the command signal from out-pipe , which is transmitted by low frequency electromagnetic wave , the in-pipe computer operates immediately the controller of X-ray system to realize its out-pipe control . 
In sequence the in-pipeand out-piperotary mechanisms are controlled by the synchro-followcontrol technique to rotate with the same center of pipeline and finish weld-seam inspection in the manner of rotating-irradiating-rotating.2 The Control System of the RobotAccording to the technologic process of working principle , the control system of X-ray IRTIPR is proposed and mainly made up of several key techniques such as the synchro-follow control technique based on the X-ray image of benchmark lead wire , the weld-seam autonomous seeking and locating technique based on data fusion and the communication of low frequency electromagnetic wave.2. 1 The Synchro-follow Control Technique of In-pipe and Out-pipe Rotary MechanismIn the light of the technologic requirement of X-ray RTIIT , the X-ray tube and the image intensifier must be required to rotate synchronously with the same center. Because the X-ray IRTIPR adopts wireless working manner , i . e. there is no tether cables linking in-pipe with out-pipe parts of the robot . How to realize the synchro-message communication between in-pipe and out-pipe control systems of rotary mechanism , or how to realize synchro-control , then becomes a key technique that must be solved.The synchro-follow rotating can be described as : when the in-pipe rotary mechanism drives X-ray tube to rotate an angle of α, the out-pipe rotary mechanism drives image intensifier to rotate the same angle synchronously with the same center too (Fig12) . Because of the shielding function of metal pipeline and wireless feature , the means of communication existed is difficult to accomplish control-message communication between in-pipe and out-pipe parts[4 ,5 ] . According to the particularity of X-ray IRTIPR , we put forward the synchro-control scheme as follows : a benchmark lead wire perpendicular to weld-seam is placed on the irradiation window of X-ray tube ; when the weld-seam is irradiated by X-ray , the benchmark lead wire is also imaged in out-pipe computer. As long as the in-pipe and out-pipe rotary mechanisms are in a synchronous position , namely the axis of irradiation window of X-ray tube is coincident with that of image intensifier (α= 0)(Fig12) , the image of benchmark lead wire is in the middle position of computer’s screen , i . e. the image of benchmark lead wire is coincident with the position of benchmark center-line ( H = 0) , see Fig13. When in-pipe rotary mechanism rotates an angle of α, the image of benchmark lead wire will deviate from benchmark center line on the screen , the distance is H. Then the distance H is used as an error input of control system of out-pipe rotary mechanism to regulate its rotating motion. Until the distance H is zero or smaller than appointed value , the synchro-follow motion of out-pipe rotary mechanism can be realized.The test and simulation prove that the above-mentioned synchro-follow control technique is correct . The synchro-motion satisfies the technologic requirement of X-ray RTIIT.The method utilizes X-ray as vision source , and the synchro-motion message of in-pipe and out-pipe rotary mechanisms is transmitted by the screen’s distance that the X-ray image of benchmark lead wire deviates from the benchmark center line , thus the synchro-motion is performed. The method has been applied for invention patent .Fig12 The synchro2rotary mechanism Fig13 The X2ray image of benchmark lead wire2. 
2 Weld-seam Autonomous Seeking and Locating TechniqueAutonomous seeking and locating mean that the robot determines automatically where is the working position in-pipe with the help of sensors but without any one’s inter-meddling. This control-manner is actually“intelligent”.The precision and reliability of seeking and locating a system have direct relation with if a robot can realize autonomous motion in-pipe. If this system is disabled , the robot will take theplace of the accident of“death”or“lose the way”in-pipe[6 ] .Generally , methods for detecting the position of weld-seam are as follows : (1) Utilize encoder or cyclometer ; (2) Utilize the displacement caused by the protrusion-concave changing of weld-seam surface ; (3) Utilize if the zone of weld-seam conducts electricity ; (4) Utilize radioactive isotope ( such as γray source) ;(5) Utilize vision ; (6) Utilize low frequency electromagnetic wave.Because these methods are influenced by many factors such as walking wheel’s skid , the in-pipe environment , manmade factors , radioactive injury , locating precision and efficiency , satisfactory result can’t be obtained when one of the methods is used alone.Considering weld-seam regular array , i . e. the space between each weld-seam is about 12m , and advantages and disadvantages of each position-detection method , one system of weld-seam autonomous seeking and locating based on multi-sensors is put forward to improve and enhance the precision , efficiency and reliability of localization. Multi-sensors consist of the cyclometer , CCD camera and the receiver and emitter of low frequency electromagnetic wave. Systematic block diagram is depicted as Fig14.Fig14 Weld-seam autonomous seeking and locating systemThe system adopts position feedback for enhancing the efficiency of localization. Vision servo is structured with image given feedback for realizing accurate localization.The data fusion based on three kinds of measure-data , which are the data of cyclometer , low frequency electromagnetic wave and vision , adopts theestimate-algorithm with priority to process data. In terms of the characteristics of three localization methods , the above data have different effective function region respectively. If X1 represents the measure-data of cyclometer , X2 of low frequency electromagnetic wave , X3 of vision. X represents the actual position in-pipe of the robot , the space between each weld-seam is 12m. Then , the effective functionregions of three kinds of measure-data are as follows respectively : X1 ∈[ 1m ,12m] ; X2 ∈[ 0. 1m ,1m] ; X3 ∈[ -10cm ,10cm ] , the final localization goal is X3 = 0. The data fusion’s rule of three kinds of measure-data is described as : when the distanceX1 away from weld-seam position is greater than 100cm , the cyclometer is employed for localization in order to enhance the efficiency , and let the in-pipe crawler move at a high speed ; when the data X2 is smaller than 100cm , the“attention”of the controller changes into the method of low frequency electromagnetic wave , and let the in-pipe crawler move at alow speed ; when the weld-seam enters the vision range , the vision servo is adopted for accurate localization.The data fusion’s rules are expressed as :X = X1if ( X3 > - 10) and ( X3 < 10) , then X = X3 ;The above-mentioned method that is realized with fuzzy control and datafusion has perfectly solved the contradiction between the precision and the efficiency of localization. 
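The priority-based fusion rule described above can be sketched directly in code. The fragment below follows the effective regions quoted in the text (cyclometer beyond 1 m, low-frequency electromagnetic wave between 0.1 m and 1 m, vision within ±10 cm); the returned speed commands are illustrative assumptions rather than the robot's actual control outputs.

```python
# Sketch of the priority-based fusion of the three position estimates:
# x1 cyclometer, x2 low-frequency electromagnetic wave, x3 vision
# (distance to the weld-seam, goal x3 = 0). All distances are in metres.

def select_estimate_and_speed(x1, x2, x3):
    """Pick the sensor whose effective region covers the current distance
    to the weld-seam and return (position_estimate, speed_command)."""
    if x3 is not None and -0.10 < x3 < 0.10:
        return x3, "servo"      # vision servo for accurate final localization
    if x2 is not None and x2 < 1.0:
        return x2, "slow"       # electromagnetic wave within 1 m of the seam
    return x1, "fast"           # cyclometer far from the seam, move quickly
```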
The test result of localization precision is within ≤±3 mm , which can meet the design requirement .2.3 The Communication of Low Frequency Electromagnetic WaveBesides the function of localization , low frequency electromagnetic wave is still utilized to transmit the off-on signal between in-pipe and out-pipe parts. Considering its dangers , the X-ray system is often operated with remote control from out-pipe. Because the robot is wireless and in view of the shielding function of metal pipeline , other methods cannot accomplish the mission that transmits the off-on signal between in-pipe and out-pipe parts. So the low frequency electromagnetic wave is adopted to transmit operation command for in-pipe computer to control the X-ray system.3 ConclusionKey techniques of the X-ray IRTIPR are assurances for X-ray RTIIT to realize automation. If a robot adopts the working means of having no cable and the synchro-follow control technique of in-pipe and out-pipe rotary mechanisms being not solved , it will be impossible for the X-ray RTIIT to realize automation at all . The weld-seam autonomous seeking and locating technique is a concrete embodiment of“intelligence”for the robot , and is also an assurance for the robot to work with high reliability. Low frequency electromagnetic wave realizes communication between a control system’s in-pipe and out-pipe parts under the condition of metal pipe’s shielding , and plays the role of closed loop of control system. The X-ray IRTIPRbased on these key techniques can be used in the inspection of the big-caliber pipeline ( at « 660 ~« 1400mm) , whose working distance is about 2km without charging and working speed is at 18m/ min. Because the robot is equipped with a small focal spot and directional beam X-ray tube , an image of weld-seam with higher definition can be obtained compared with other kinds of X-ray tube.These key techniques are proved in test and meet perfectly the technologic requirements of X-ray RTIIT.References[1 ] Zeng X Z, Sun Z C. Nondestructive Testing , 2001 , 12(12) : 530[2 ] Zheng S C. Nondestructive Testing , 2000 , 22(7) : 328[3 ] Jiang S Y. Research on in2pipe X2ray inspection robot technol2ogy and theory : [ dissertation ] . Harbin : Harbin Institute ofTechnology , 2001[4 ] Blettner A , Chauveau D , Becker. Robot for computerized realtime radiographic inspection of on shore pipe welds. In : 6thEuropean Conference on NonDestructive Testing. Nice ,France : 1994. 225[5 ] Anon. Welding in the world , 1990 , 28(5) :77[6 ] Bjorkholm P J , Parker R , Johnson M. Design and applicationof a digital radiographic weld inspection system. In : Span An2tonio , eds. 1990 ASNT Spring Conference. Texas , UnitedStates : 1990. 187中文译文:X射线实时影象探伤管道机器人的关键技术摘要这篇论文介绍了一种检查大口径管道焊接连接的机器人系统,它被发展作为X射线实时图象检查法 [RTIIT]的自动化平台。

Henan Polytechnic University, Wanfang College of Science and Technology - Graduation Project Foreign Literature Translation
Undergraduate Graduation Project English Paper
Department (Faculty):
Major:
Class:
Name:
Student ID:
June 10, 2011
Translation:
Pipeline Inspection Robot with a Simple Structure

This paper presents an original architecture for an in-pipe inspection robot. The robot consists of two parts connected by a universal joint. One part carries wheels whose planes are parallel to the pipe axis, so that it translates along the pipe, while the other carries wheels tilted with respect to the axis, so that it is forced to follow a helical motion as it rotates. A single electric motor placed between the two bodies generates the motion. All the wheels are mounted on suspensions in order to accommodate changes in pipe diameter and curved pipes. The robot carries its own batteries and a radio link. Four versions have been built for different pipe diameters of 170, 70 and 40 mm. For the smaller diameters, the batteries and the radio receiver can be placed in additional bodies. The architecture is very simple, and the rotary motion can also be exploited for scrubbing or inspection tasks.

Keywords: autonomous mobile robot, in-pipe inspection, helical motion
In-pipe inspection robots have been studied for a long time, and many original locomotion concepts have been proposed to overcome the technical difficulties related to changes in pipe diameter, bends and power supply. Although an exhaustive literature review is impossible in the limited space available, several broad categories can be identified:
(1) At small scales, many projects follow the earthworm principle: a central part performs the axial motion, while devices attached at both ends block against the pipe. Pneumatic versions of this concept have been proposed (e.g. [1]), but they require a power umbilical. For smaller diameters (10 mm or less), piezoelectric actuation is used, either according to the inchworm principle, based on inertial motion driven by a sawtooth voltage [2], or using vibrating fins with differential friction coefficients [3].
(2) For medium-size pipes of various kinds, classical electromechanical systems with kinematic structures involving wheels and tracks have been proposed, offering adaptability to the diameter and the ability to negotiate bends.
(3) For large pipes, walking pipe crawlers have also been proposed [6].
The four mobile robots presented in this paper belong to the second category; they cover pipe diameters from 40 to 170 mm. The design attempts to reduce the complexity of the machine by achieving mobility along the pipe with a single actuator. Even though our work can be regarded as the result of an independent effort, the helical principle appears to have been studied before.

Architecture
Figure 1
The robot consists of two main parts, a stator and a rotor, connected by a DC motor with a gear reducer and a universal joint. The stator is equipped with a set of wheels parallel to the pipe axis, which constrain it to translate along the pipe, while the wheels of the rotor are tilted so that they can only follow helical trajectories. The relation between the axial velocity and the rotational velocity of the robot is

v = ω · R · tan α

where R is the pipe radius and α is the tilt angle of the wheels.
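As a quick numerical check of this relation, the snippet below evaluates v = ω·R·tan α for an assumed rotor speed, pipe radius and wheel tilt; the figures are purely illustrative and are not taken from the prototypes.

```python
import math

# v = omega * R * tan(alpha): axial speed produced by the helical motion.
# All numerical values below are assumptions for illustration only.
omega = 2.0                      # rotor angular speed in rad/s
R = 0.085                        # pipe radius in m (roughly a 170 mm pipe)
alpha = math.radians(15)         # tilt angle of the rotor wheels

v = omega * R * math.tan(alpha)  # axial speed in m/s, about 0.046 here
print(f"axial speed v = {v:.3f} m/s")
```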

The stator and the rotor must guarantee the stability of the robot inside the pipe and sufficient contact forces, so as to accommodate changes in pipe diameter and obstacles and to allow travel through curved pipes. For the larger robot (D-170), the bodies are rigidly connected to the motor shaft, which is sufficient to ensure stability. For the smaller diameters, curved pipes require a greater degree of freedom, and twice as many wheels are needed.

Figure 2
Figure 3
The robots for diameters of 70 mm and above are powered by nine batteries distributed over the stator around the motor. Tests showed that this configuration is not possible for the smaller diameters, so the robot is divided into three bodies: the first consists of the rotor, the second contains the motor and the gear reducer, and the third is the stator with its axial wheels, the power supply and the radio link. Two schemes were investigated for the 70 mm diameter. In the first, the motor and the batteries are mounted on the stator; this arrangement cannot be used in the second scheme, where the robot is powered through a cable.
