
機(jī)械設(shè)計(jì)制造及其自動(dòng)化大專論文(已改無(wú)錯(cuò)字)


of interaction between real-time vision systems capable of tracking moving objects in 3D and a robot arm equipped with a dexterous hand that can be used to intercept, grasp, and pick up a moving object. We are interested in exploring the interplay of hand-eye coordination for dynamic grasping tasks such as grasping of parts on a moving conveyor system, assembly of articulated parts, or grasping from a mobile robotic system. Coordination between an organism's sensing modalities and motor control system is a hallmark of intelligent behavior, and we are pursuing the goal of building an integrated sensing and actuation system that can operate in dynamic, as opposed to static, environments.

There has been much research in robotics over the last few years that addresses either visual tracking of moving objects or generalized grasping problems. However, there have been few efforts that try to link the two problems. It is quite clear that complex robotic tasks such as automated assembly will need integrated systems that use visual feedback to plan, execute, and monitor grasping.

The system we have built addresses three distinct problems in robotic hand-eye coordination for grasping moving objects: fast computation of 3D motion parameters from vision, predictive control of a moving robotic arm to track a moving object, and interception and grasping. The system is able to operate at approximately human arm-movement rates, using visual feedback to track, intercept, stably grasp, and pick up a moving object. The algorithms we have developed that relate sensing to actuation are quite general and applicable to a variety of complex robotic tasks that require visual feedback for arm and hand control.

Our work also addresses a very fundamental and limiting problem that is inherent in building integrated sensing-actuation systems: integration of systems with different sampling and processing rates.
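The rate-integration problem above can be made concrete with a minimal sketch (the function and numbers are our own illustration, not from the paper): a vision estimate is stamped with its capture time and extrapolated forward, under a constant-velocity assumption, by the pipeline latency before the faster servo loop consumes it.

```python
# Hypothetical delay-compensation helper: a slow vision pipeline delivers a
# 3D position that is already stale by the time the arm controller sees it,
# so we extrapolate it from the capture instant to the current servo time.

def compensate_delay(pos, vel, capture_time, now):
    """Extrapolate a 3D position (constant-velocity assumption) from the
    instant the camera frame was captured to the current servo time."""
    latency = now - capture_time  # vision processing + transport delay (s)
    return tuple(p + v * latency for p, v in zip(pos, vel))

# Example: object at (1.0, 0.0, 0.5) m moving at 0.2 m/s along x,
# seen through a 100 ms pipeline:
compensate_delay((1.0, 0.0, 0.5), (0.2, 0.0, 0.0),
                 capture_time=0.0, now=0.1)   # ≈ (1.02, 0.0, 0.5)
```

The same extrapolation idea underlies the predictive filtering discussed next; the sketch only shows the latency bookkeeping.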
Most complex robotic systems are actually amalgams of different processing devices, connected by a variety of methods. For example, our system consists of three separate computation systems: a parallel image-processing computer; a host computer that filters, triangulates, and predicts 3D position from the raw vision data; and a separate arm-control computer that performs inverse kinematic transformations and joint-level servoing. Each of these systems has its own sampling rate, noise characteristics, and processing delays, which need to be integrated to achieve smooth and stable real-time performance. In our case, this involves overcoming visual-processing noise and delays with a predictive filter based upon a probabilistic analysis of the system noise characteristics. In addition, real-time arm control needs to be able to operate at fast servo rates regardless of whether new predictions of object position are available.

The system consists of two fixed cameras that can image a scene containing a moving object (Fig. 1). A PUMA-560 with a parallel-jaw gripper attached is used to track and pick up the object as it moves (Fig. 2). The system operates as follows:

1) The imaging system performs a stereoscopic optic-flow calculation at each pixel in the image. From these optic-flow fields, a motion energy profile is obtained that forms the basis for a triangulation that can recover the 3D position of a moving object at video rates.

2) The 3D position of the moving object computed by step 1 is initially smoothed to remove sensor noise, and a nonlinear filter is used to recover the correct trajectory parameters, which can be used for forward prediction; the updated position is sent to the trajectory-planner/arm-control system.

3) The trajectory planner updates the joint-level servos of the arm via kinematic transform equations.
An additional fixed-gain filter is used to provide servo-level control in case of missed or delayed communication from the vision and filtering system.

4) Once tracking is stable, the system commands the arm to intercept the moving object, and the hand is used to grasp the object stably and pick it up.

The following sections of the paper describe each of these subsystems in detail, along with experimental results.

II. PREVIOUS WORK

Previous efforts in the areas of motion tracking and real-time control are too numerous to list exhaustively here. We instead list some notable efforts that have inspired us to use similar approaches. Burt et al. [9] have focused on high-speed feature detection and hierarchical scaling of images in order to meet the real-time demands of surveillance and other robotic applications. Related work has been reported by Lee and Wohn [29] and by Wiklund and Granlund [43], who use image-differencing methods to track motion. Corke, Paul, and Wohn [13] report a feature-based tracking method that uses special-purpose hardware to drive a servo controller of an arm-mounted camera. Goldenberg et al. [16] have developed a method that uses temporal filtering with vision hardware similar to our own. Luo, Mullen, and Wessel [30] report a real-time implementation of motion tracking in 1-D based on Horn and Schunck's method. Verghese et al. [41] report real-time short-range visual tracking of objects using a pipelined system similar to our own. Safadi [37] uses a tracking filter similar to our own and a pyramid-based vision system, but few results are reported with this system. Rao and Durrant-Whyte [36] have implemented a Kalman-filter-based decentralized tracking system that tracks moving objects with multiple cameras. Miller [31] has integrated a camera and arm for a tracking task where the emphasis is on learning kinematic and control parameters of the system. Weiss et al. [42] also use visual feedback to develop control laws for manipulation.
Brown [8] has implemented a gaze control system that links a robotic "head" containing binocular cameras with a servo controller.
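The "additional fixed-gain filter" mentioned in step 3 above is the kind of role an alpha-beta tracker typically fills. The following is a hypothetical one-dimensional sketch of such a filter, not the paper's implementation; the class name, gains, and servo period are our own assumptions. Its key property for this application is that it still produces an estimate on ticks where no vision measurement arrives, by coasting on its velocity estimate.

```python
# Hypothetical fixed-gain (alpha-beta) filter sketch. Between vision
# updates the filter coasts on a constant-velocity model, so the servo
# loop always has a target position even when a measurement is missed
# or delayed. Gains and period are illustrative, not from the paper.

class AlphaBetaFilter:
    def __init__(self, x0, dt, alpha=0.85, beta=0.005):
        self.x = x0        # position estimate
        self.v = 0.0       # velocity estimate
        self.dt = dt       # servo period (s)
        self.alpha = alpha # position-correction gain
        self.beta = beta   # velocity-correction gain

    def step(self, z=None):
        """Advance one servo tick; z is the vision measurement,
        or None when no new measurement arrived this tick."""
        x_pred = self.x + self.v * self.dt   # constant-velocity predict
        if z is None:
            self.x = x_pred                  # missed update: coast
        else:
            r = z - x_pred                   # measurement residual
            self.x = x_pred + self.alpha * r
            self.v = self.v + (self.beta / self.dt) * r
        return self.x
```

With a 100 Hz servo period and an object moving at a constant 1 unit/s, the estimate converges to the true trajectory with zero steady-state lag (a standard alpha-beta property for constant-velocity targets), and `step(None)` keeps the arm target advancing across dropped vision frames.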
點(diǎn)擊復(fù)制文檔內(nèi)容
試題試卷相關(guān)推薦
文庫(kù)吧 www.dybbs8.com
備案圖片鄂ICP備17016276號(hào)-1