VIDEO TRACKING CONTROL ALGORITHMS FOR UNMANNED AIR VEHICLES

Ricardo Bencatel, Department of Electrical and Computer Engineering, Faculty of Engineering of University of Porto, Porto, 4200-465 – Portugal, [email protected]

João Borges Sousa, Department of Electrical and Computer Engineering, Faculty of Engineering of University of Porto, Porto, 4200-465 – Portugal, [email protected]

Gil M. Gonçalves, Department of Informatics Engineering, Faculty of Engineering of University of Porto, Porto, 4200-465 – Portugal, [email protected]

João Correia, Department of Electrical and Computer Engineering, Faculty of Engineering of University of Porto, Porto, 4200-465 – Portugal, [email protected]

Elói Pereira, Portuguese Air Force Academy, Sintra, 2715-021 – Portugal, [email protected]

ABSTRACT
We present and discuss a modular flight control system suitable for video tracking of structures such as rivers, roads or canals with Unmanned Air Vehicles (UAVs). The control system is modeled in the framework of hybrid automata, where each discrete state corresponds to a different control algorithm. We implemented four nonlinear turn-rate control algorithms and compared them in a simulation environment to assess tracking performance.

NOMENCLATURE
V         Airspeed
α         Angle of attack
β         Drift angle
p, q, r   Angular rates in the aircraft reference frame
φ, θ, ψ   Aircraft Euler angles
x, y, z   Positions in the fixed local reference frame
Vg        Linear speed in the earth-fixed reference frame
χ         Aircraft course
W         Wind speed
ψw        Wind heading
ψs        Structure heading at the look-ahead point
rcmd      Turn-rate command

INTRODUCTION
We are developing a control system to command an Unmanned Air Vehicle (UAV) to track a visible border, such as a river or a road, or to inspect power lines or canals [1]. The goal is to track those structures with UAVs equipped with a commercial autopilot, a computer system and video systems. These UAVs are being developed in the context of a collaborative project involving Porto University and the Portuguese Air Force Academy. The technical characteristics of our UAV systems are presented in the Section “UAV systems”. The video tracking system is now part of the Apollo command and control framework developed by the AsasF project from Porto University [2]. The a priori knowledge about the feature to track is provided by digital maps and/or Geographic Information Systems (GIS). This information is used to set a search area for the UAV. In our concept of operations the tracking algorithm receives the features to be tracked from the video system and generates turn-rate commands to be sent to the autopilot. Rivers and roads are structures with varying curvature, which may even be discontinuous. Most tracking controllers [6, 8] were developed to follow straight lines or circles. We are concerned with extending this work to more general geometric shapes. We implemented a curve tracker [7] and adapted two line trackers [6, 8], as described in the Section “Flight control”.

The main achievements of the work reported in this paper are the implementation and evaluation of four tracking controllers. The results are presented in the “Simulation results” section. We present the conclusions and discuss future research in the last section.


UAV SYSTEMS
The Portuguese Air Force Academy and Porto University have been developing and operating four types of UAVs with wingspans ranging from 2 to 6 m [3]. These UAVs are stabilized and controlled with the help of a Piccolo autopilot [4]. With the low-level control handled by the autopilot, the structure tracker only has to output flight commands (turn rates) to it. Figure 1 depicts the main blocks of the UAV video tracking system.

FIGURE 1. VIDEO TRACKING SYSTEM (video processing, structure characterization, flight control and autopilot blocks; signals: tracked structure, aircraft position, V, x, y, ψ, θ, φ, and the turn-rate reference)

Equations of motion
The UAV has six degrees of freedom and its states are {V, α, β, p, q, r, φ, θ, ψ, x, y, z, Vgx, Vgy, Vgz}.



V, α and β set the airspeed vector magnitude and orientation in the aircraft reference frame. φ, θ and ψ are the Euler angles (roll, pitch and yaw) in an earth-fixed reference frame and p, q and r are the corresponding rates. Vgx, Vgy and Vgz represent the aircraft velocities along the three axes of an earth-fixed reference frame. The roll, pitch and yaw rates are directly controlled by the ailerons, elevators and rudder, respectively. The airspeed and the altitude change rate are controlled directly by the engine power. The other states are not controlled directly and depend on the wind conditions. The autopilot deals with most of the dynamics, freeing us to concentrate on the trajectory control. We can therefore assume that the system is characterized by the states {V, r, φ, ψ, x, y, Vgx, Vgy}, which are related by Eq. (1):

\dot{x} = V \cos(\psi) + W_x = V_g \cos(\chi) = V_{g_x}
\dot{y} = V \sin(\psi) + W_y = V_g \sin(\chi) = V_{g_y}
\dot{\psi} = \omega = \frac{g \tan(\phi)}{V}    (1)
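As a reading aid, a minimal numerical sketch of the kinematic model of Eq. (1) is given below; the wind components Wx and Wy are assumed known and the Euler integration step is an illustrative choice.

```python
import numpy as np

G = 9.81  # gravity [m/s^2]

def kinematics_step(x, y, psi, V, phi, Wx, Wy, dt):
    """One Euler step of Eq. (1): planar kinematics plus the coordinated-turn rate."""
    x_dot = V * np.cos(psi) + Wx        # ground-frame x velocity (Vg_x)
    y_dot = V * np.sin(psi) + Wy        # ground-frame y velocity (Vg_y)
    psi_dot = G * np.tan(phi) / V       # turn rate produced by the bank angle
    return x + x_dot * dt, y + y_dot * dt, psi + psi_dot * dt
```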

Video processing and flight control run onboard the UAV PC104 computer system. The available computational power is constrained by the space and power limitations imposed by the small size of the UAV. The performance of the video processing software is compatible with the real-time requirements of flight control.

The tracked structure is represented by a 2D spline curve in x and y. The video processing algorithm outputs points of the target structure and a spline is fitted to these points by the structure characterization algorithm.
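A minimal sketch of such a spline fit using SciPy is given below; the smoothing factor and the number of evaluation points are illustrative assumptions, not values used by the actual system.

```python
import numpy as np
from scipy.interpolate import splprep, splev

def fit_structure_spline(points_xy, smoothing=5.0, n_samples=200):
    """Fit a 2D parametric spline to structure points extracted from the video frames."""
    x, y = np.asarray(points_xy, dtype=float).T
    tck, _ = splprep([x, y], s=smoothing)     # parametric spline through the points
    u = np.linspace(0.0, 1.0, n_samples)
    xs, ys = splev(u, tck)                    # densely sampled curve for the controllers
    return np.column_stack([xs, ys])
```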

FIGURE 2. UAV SYSTEM ARCHITECTURE

The video camera is rigidly attached to the aircraft body (Fig. 2). This mechanical configuration makes the video image highly dependent on the flight attitude. Hence, the video camera is required to acquire images in favorable conditions to facilitate the separation between the targeted structures and the surrounding environment. The video tracking system is divided into three main blocks: video processing, structure characterization and flight control. The video processing block separates the target from the surrounding environment; for this, the OpenCV library was used. The structure characterization block outputs a geometric characterization of the structure to track. The flight control block uses the geometry of the structure to set the best tracking trajectory.

FLIGHT CONTROL

Our goal is to maximize the area covered by the video camera along the geometric representation of the structure. To do this, the UAV should be kept as close as possible to horizontal while minimizing the lateral error between the centers of the image and of the structure. To evaluate the tracking process we use the minimum Euclidean distance from the target path to the aircraft centre, designated tracking error, or to the image centre, designated image tracking error, the UAV bank angle (φ) and the controller saturation level.

Hybrid model
The flight control system depicted in Fig. 3 is modeled as a six-state hybrid automaton [5], similar to the one presented by Rathinam et al. [7]. The discrete states are:
• Stand by, in which the UAV loiters over a predefined position awaiting execution commands;
• Search, which implements a spiral search pattern with bounds defined by the operator;
• Align, a maneuver designed to align the UAV with the structure;
• GPS tracking, a transition state that relies on GPS data for navigation while more video data is gathered;
• Video tracking, which uses only the feedback from the video processing system to maneuver the UAV;
• Safe mode, a state that can be activated from any of the above states by an operator input or by a system error, in which the UAV is commanded to loiter over a predefined position.
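For illustration, a minimal sketch of this discrete-state logic is given below; the event names are hypothetical and the transition conditions are simplified from the description in the text.

```python
from enum import Enum, auto

class Mode(Enum):
    STAND_BY = auto()
    SEARCH = auto()
    ALIGN = auto()
    GPS_TRACKING = auto()
    VIDEO_TRACKING = auto()
    SAFE_MODE = auto()

def next_mode(mode, ev):
    """Simplified transition logic; `ev` is a dict of boolean events (illustrative names)."""
    if ev.get("operator_safe") or ev.get("system_error"):
        return Mode.SAFE_MODE                     # reachable from any state
    if mode is Mode.STAND_BY and ev.get("start_search"):
        return Mode.SEARCH                        # operator starts the search
    if mode is Mode.SEARCH and ev.get("structure_detected"):
        return Mode.ALIGN                         # structure found by the video system
    if mode is Mode.ALIGN and ev.get("align_done"):
        return Mode.GPS_TRACKING                  # transition state while video data is gathered
    if mode is Mode.GPS_TRACKING and ev.get("video_reliable"):
        return Mode.VIDEO_TRACKING                # video feedback drives the controller
    return mode
```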

(The transitions in Fig. 3 include a “lost structure” time-out.)

FIGURE 3. FLIGHT CONTROL DISCRETE STATES

The system starts in the Stand by state. The operator commands the start of the Search procedure. The Align state is activated upon detection of the structure and commands the transition to GPS tracking at the end of the align maneuver. The system goes to Video tracking when the video data reliability assessment is positive.

Reference determination: Structure look-ahead point
Initially we considered two methods to define the look-ahead point on the structure. In the first one, we define a look-ahead distance in the aircraft reference frame and project it onto the curve representing the structure (Fig. 4a). The other method consists in first projecting the aircraft position onto the image predominant axis; we then define a look-ahead distance from this projection and project this distance onto the curve (Fig. 4b). The first method is conceptually simpler, but it requires solving implicit equations to compute the lateral error and the projection point. The second method requires two reference frame transformations and ensures that the initial desired direction will be followed, even if for some reason the aircraft moves in the wrong direction. We chose the second method for this reason.

FIGURE 4. LOOK-AHEAD POINT DETERMINATION

Aircraft control: Guidance control
We now describe the tracking algorithms. In the development of this work we adapted, created and tested several tracking controllers. Here we present the results from the four best algorithms (in terms of performance and distinctive behavior).

Trigonometric controllers. We developed two adaptations of the waypoint tracker developed for the Aerosonde UAV [6]. These are based on the geometric relationship between the line tangent to the structure at the look-ahead point and the ground speed vector, as illustrated in Fig. 5. The look-ahead point is used to define the 2D line with which we want the aircraft to align. We then define a temporary target point on that line to get the commanded course.
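A minimal sketch of the look-ahead point determination described above (the second method) is given below; a densely sampled polyline stands in for the structure spline, the image predominant axis is represented by a point and a unit direction, and the helper names are illustrative.

```python
import numpy as np

def look_ahead_point(curve_xy, aircraft_xy, axis_point, axis_dir, look_ahead_dist):
    """curve_xy: (N, 2) sampled structure curve; the predominant axis is given by a
    point on it and a direction vector; look_ahead_dist in the same units as the curve."""
    curve = np.asarray(curve_xy, dtype=float)
    a0 = np.asarray(axis_point, dtype=float)
    a = np.asarray(axis_dir, dtype=float)
    a = a / np.linalg.norm(a)
    # 1) project the aircraft position onto the image predominant axis
    s = np.dot(np.asarray(aircraft_xy, dtype=float) - a0, a)
    # 2) advance the projection by the look-ahead distance along the axis
    advanced = a0 + (s + look_ahead_dist) * a
    # 3) project the advanced point onto the curve (nearest sampled point stands in
    #    for the true orthogonal projection onto the spline)
    idx = np.argmin(np.linalg.norm(curve - advanced, axis=1))
    return curve[idx]
```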

FIGURE 5. TRIGONOMETRIC CONTROLLER VARIABLES

We can define the velocity vector and the relative position vector (aircraft to target point) in the tracking reference frame, i.e., with the x axis tangent to the structure at the look-ahead point. The kinematic model can therefore be described by Eq. (2), or by Eq. (3) if we want to use the ground speed data directly.

V_{x_{track}}(t) = V \cos(\psi(t) - \psi_s) + W \cos(\psi_w - \psi_s)
V_{y_{track}}(t) = -V \sin(\psi(t) - \psi_s) - W \sin(\psi_w - \psi_s)
\dot{\psi}(t) = r_{cmd}(t)    (2)

V_{x_{track}}(t) = V_{g_x} \cos(\psi_s) + V_{g_y} \sin(\psi_s)
V_{y_{track}}(t) = V_{g_x} \sin(\psi_s) - V_{g_y} \cos(\psi_s)
\dot{\psi}(t) = r_{cmd}(t)    (3)
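A minimal sketch of the rotation of Eq. (3), which expresses the ground-frame velocity in the tracking frame, is given below.

```python
import numpy as np

def ground_to_track(vg_x, vg_y, psi_s):
    """Eq. (3): ground-frame velocity components expressed in the tracking frame,
    whose x axis is tangent to the structure at the look-ahead point (heading psi_s)."""
    vx_track = vg_x * np.cos(psi_s) + vg_y * np.sin(psi_s)
    vy_track = vg_x * np.sin(psi_s) - vg_y * np.cos(psi_s)
    return vx_track, vy_track
```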

The goal of this algorithm is to point the aircraft to the temporary target point E. To achieve that, the ABC triangle has to become similar to the ADE triangle. That only happens if:

\frac{x_{Target}}{y_{Track}} = \frac{V_x}{V_y}    (4)

From this expression we can derive the error expression,

E = x_{Target} \cdot V_y - y_{Track} \cdot V_x    (5)

A proportional controller on this error is

\dot{\psi} = K_\psi \left( x_{Target} \cdot V_y - y_{Track} \cdot V_x \right)    (6)

The right-hand side of Eq. (6) has a user-defined gain and a variable (xTarget) that we define next. The two adaptations of the original waypoint tracker differ only in this definition. One uses a constant distance (CD), as expressed in Eq. (7). The other uses an additional gain over the longitudinal distance to the look-ahead point, xTrack (Proportional Distance, PD), as described in Eq. (8).

\dot{\psi} = K_\psi \left( x_{Const} \cdot V_y - y_{Track} \cdot V_x \right)    (7)

\dot{\psi} = K_\psi \left( \kappa \cdot x_{Track} \cdot V_y - y_{Track} \cdot V_x \right)    (8)
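A minimal sketch of the two trigonometric turn-rate laws of Eqs. (7) and (8) is given below; the track-frame velocities and distances are assumed to come from the look-ahead computation and Eq. (3), and the default gains are the values quoted in the caption of Fig. 6.

```python
def turn_rate_cd(vx_track, vy_track, y_track, k_psi=1.15e-4, x_const=150.0):
    """Constant-distance (CD) law, Eq. (7)."""
    return k_psi * (x_const * vy_track - y_track * vx_track)

def turn_rate_pd(vx_track, vy_track, x_track, y_track, k_psi=1.15e-4, kappa=0.75):
    """Proportional-distance (PD) law, Eq. (8): the target distance scales with the
    longitudinal distance to the look-ahead point."""
    return k_psi * (kappa * x_track * vy_track - y_track * vx_track)
```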

FIGURE 6. TRIGONOMETRIC CONTROLLERS VECTOR FIELDS (Kψ = 1.15E-4; XCONST = 150 METERS; K = 0.75)

The CD method presents a satisfactory behavior, although it may diverge in conditions where the tangent to the structure is near 70-90º. This is where the PD method shows better performance, because it can be set to point to the look-ahead point, or very near it, if the gain is near unity. Figure 6 presents a comparison of the vector fields associated with the two methods. The PD method “forces” a faster convergence to the structure.

Interception contour controller. This algorithm is based on a cubic-curve approach contour (Fig. 7), as presented in [7]. The reference frame used is fixed to the aircraft. The cubic curve is defined so that it passes through the aircraft position with zero slope at that point, and so that it is tangent to the structure at the look-ahead point. The cubic curve equation then assumes the form of the first row of Eq. (9). From Eq. (9) we can define α and β, as presented in Eq. (10).

y_c = \alpha x_c^3 + \beta x_c^2
y_c' = 3\alpha x_c^2 + 2\beta x_c    (9)

\alpha = -2\frac{y_c}{x_c^3} + \frac{y_c'}{x_c^2}
\beta = 3\frac{y_c}{x_c^2} - \frac{y_c'}{x_c}    (10)

FIGURE 7. INTERCEPTION CONTOUR

Setting the turn-rate command proportional to the curvature of the interception contour at the aircraft location results in Eq. (11):

r_{cmd}(t) = 2V\beta = 2V\left( 3\frac{y_c(x=x_c)}{x_c^2} - \frac{y_c'(x=x_c)}{x_c} \right)    (11)
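A minimal sketch of the interception-contour law of Eqs. (10) and (11) is given below; the look-ahead point (xc, yc) and the structure slope yc' are assumed to be already expressed in the aircraft-fixed frame.

```python
def turn_rate_contour(V, x_c, y_c, yp_c):
    """Eq. (11): turn rate from the cubic interception contour.
    (x_c, y_c): look-ahead point in the aircraft frame; yp_c: structure slope dy/dx there."""
    beta = 3.0 * y_c / x_c**2 - yp_c / x_c   # Eq. (10)
    return 2.0 * V * beta                    # r_cmd = 2 V beta
```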

This algorithm has a very smooth behavior if the look-ahead point is ahead of the aircraft. On the other hand, if the UAV is pointing close to, or more than, 90º away from the look-ahead point, the algorithm produces anomalous behaviors, which may include driving the aircraft away from the structure.

Sliding mode controller. We also adapted a line tracker that uses a sliding mode approach [8]. This controller uses two different courses: a desired and a commanded course. The desired course (Eq. 12) depends only on the relative position to the look-ahead point and on the heading of the structure at that same point, as in the PD and CD methods (Fig. 5). The commanded course (Eq. 13) is adjusted between the desired course and the aircraft's current course.

\chi_d = \psi_s - \chi_\infty \frac{2}{\pi} \tan^{-1}(k \cdot y_{Track})    (12)

\chi_c = \chi - \frac{1}{\alpha}\left[ \chi_\infty \frac{2}{\pi} \frac{k \cdot V_{y_{track}}}{1 + (k \cdot y_{Track})^2} + \kappa \, \mathrm{sat}\!\left(\frac{\tilde{\chi}}{\varepsilon}\right) \right]    (13)

r_{cmd} = \alpha(\chi_c - \chi), \qquad \tilde{\chi} = \chi - \chi_d    (14)

\mathrm{sat}(x) = \begin{cases} x & |x| \le 1 \\ 1 & x > 1 \\ -1 & x < -1 \end{cases}    (15)

r_{cmd} = -\left[ \chi_\infty \frac{2}{\pi} \frac{k \cdot V_{y_{track}}}{1 + (k \cdot y_{Track})^2} + \kappa \, \mathrm{sat}\!\left(\frac{\tilde{\chi}}{\varepsilon}\right) \right]    (16)

In these expressions, χ∞ defines the approach course relative to the tangent at the structure's look-ahead point, α sets the proportional gain between the commanded course error and the turn-rate command (needed only if the autopilot accepts course commands exclusively), k adjusts the approach turn trajectory, and κ and ε are the sliding mode parameters. The vector field in Fig. 8 presents the courses generated by this sliding mode algorithm. All commanded courses were calculated assuming the aircraft was heading north, which is why a larger divergence is observed above the structure. This method easily achieves small tracking errors, but it also creates non-negligible oscillations because of chattering.

FIGURE 8. SLIDING MODE CONTROLLER VECTOR FIELD
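A minimal sketch of the sliding-mode turn-rate law (Eqs. 12 and 14-16, as reconstructed above) is given below; the gain values are illustrative assumptions and the sign conventions follow that reconstruction.

```python
import numpy as np

def sat(x):
    """Eq. (15): saturation function."""
    return np.clip(x, -1.0, 1.0)

def sliding_mode_turn_rate(chi, psi_s, y_track, vy_track,
                           chi_inf=np.radians(60.0), k=0.01, kappa=0.5, eps=0.1):
    """Turn-rate command from Eqs. (12), (14) and (16); gains are illustrative."""
    chi_d = psi_s - chi_inf * (2.0 / np.pi) * np.arctan(k * y_track)   # Eq. (12)
    chi_tilde = chi - chi_d                                            # Eq. (14)
    r_cmd = -(chi_inf * (2.0 / np.pi) * k * vy_track / (1.0 + (k * y_track) ** 2)
              + kappa * sat(chi_tilde / eps))                          # Eq. (16)
    return r_cmd
```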

SIMULATION RESULTS
The four tracking algorithms described in the previous section were implemented in MatLab® and tested in a simulation environment. The kinematic model used is represented in Eq. (17). The turn rate and the bank angle result from the commanded turn rate conditioned by low-pass filters. The aircraft bank angle does not have any influence on the other variables. The bank angle dynamics are faster than the turn-rate dynamics, and by simulating them separately we can compute the image projection accurately.

\dot{x} = V \cos(\psi(t)) + W_x
\dot{y} = V \sin(\psi(t)) + W_y
\dot{\psi}(s) = \frac{1}{T_r s + 1} r_{cmd}(s)
\phi(s) = \tan^{-1}\!\left(\frac{V_g \cdot r_{cmd}(s)}{g}\right) \frac{1}{T_\phi s + 1}    (17)
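A minimal discrete-time sketch of this simulation model is given below; the first-order filters of Eq. (17) are discretized with a forward-Euler step and the time constants Tr and Tφ are illustrative assumptions.

```python
import numpy as np

G = 9.81  # gravity [m/s^2]

def simulate_step(state, r_cmd, V, Vg, Wx, Wy, dt, Tr=0.5, Tphi=0.2):
    """One step of the kinematic model of Eq. (17).
    state = (x, y, psi, r, phi); Tr and Tphi are the filter time constants."""
    x, y, psi, r, phi = state
    x += (V * np.cos(psi) + Wx) * dt
    y += (V * np.sin(psi) + Wy) * dt
    r += (r_cmd - r) / Tr * dt                  # first-order turn-rate response
    psi += r * dt
    phi_cmd = np.arctan(Vg * r_cmd / G)         # coordinated-turn bank angle
    phi += (phi_cmd - phi) / Tphi * dt          # first-order bank-angle response
    return (x, y, psi, r, phi)
```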

The look-ahead distance for the controllers was set to 100 meters for the PD and sliding mode methods, to 70 meters for the CD method and to 145 meters for the interception contour method. Figure 9a shows that all controllers can handle medium misalignments fairly well. The interception contour controller shows the smallest tracking error under these conditions (Fig. 9b). Large misalignments (Fig. 10a) are still well corrected by the CD and sliding mode controllers. Figures 10a and 10c present an example of an anomalous behavior generated by the interception contour controller, caused by the perpendicularity between the course and the tracked path. Both tracking error illustrations (Figs. 9b and 10b) show that the PD method tends to present slowly damped oscillations, although it is also clear that it converges with less longitudinal movement (east direction). If we disregard the anomalous behavior of the interception contour controller, the sliding mode controller generates the most oscillatory turn-rate reference (Figs. 9c and 10c).


FIGURE 9. START FROM X=50, Y=-200 AND HEADING 30ºN (A: FLIGHT PATH; B: TRACKING ERROR; C: TURN RATE OUTPUT)

FIGURE 10. START FROM X=-150, Y=-500 AND HEADING 70ºN (A: FLIGHT PATH; B: TRACKING ERROR; C: TURN RATE OUTPUT)

Figure 11 presents the predicted ground video coverage for each of the four tested algorithms under strong wind perturbations. The sliding mode controller exhibited more chattering (Fig. 11d). The interception contour method was unable to compensate for the misalignment caused by the wind and presented high tracking errors (Fig. 11c). The performance of the two trigonometric methods did not change much with respect to the no-wind situation (Figs. 11a and 11b).

FIGURE 11. GROUND VIDEO SWEEP WITH WIND (16 KTS, 135ºN): A) PD METHOD; B) CD METHOD; C) INTERCEPTION CONTOUR METHOD; D) SLIDING MODE METHOD

The following tables summarize the results of the comparative study according to the defined parameters. AC track error and Image track error are, respectively, the minimum Euclidean distance on the horizontal plane (XY) between the target path and the aircraft position or the image centre point. Average banking and Saturation level allow a partial evaluation of the flight smoothness. Average banking also allows estimating the divergence between the XY aircraft position and the image centre point.
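A minimal sketch of this tracking-error metric is given below, with the target path taken as a densely sampled polyline.

```python
import numpy as np

def track_error(path_xy, point_xy):
    """Minimum XY Euclidean distance between the sampled target path and a point
    (the aircraft position or the image centre point)."""
    path = np.asarray(path_xy, dtype=float)
    point = np.asarray(point_xy, dtype=float)
    return float(np.min(np.linalg.norm(path - point, axis=1)))
```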

TABLE 1. PERFORMANCE DATA - AIRCRAFT ALIGNED

                    PD Trig    CD Trig    Int Cont   Slid M
AC track error      12.7 m     7.0 m      5.8 m      4.9 m
Image track error   17.7 m     16.6 m     16.9 m     17.2 m
Average banking     2.3 º      2.2 º      2.2 º      5.1 º
Saturation level    0.00%      0.00%      0.00%      5.71%

TABLE 2. PERFORMANCE DATA - AIRCRAFT FAR AWAY

                    PD Trig    CD Trig    Int Cont   Slid M
AC track error      112.8 m    113.1 m    115.2 m    105.5 m
Image track error   118.0 m    127.7 m    120.2 m    119.9 m
Average banking     6.0 º      2.4 º      7.9 º      3.4 º
Saturation level    5.75%      1.37%      37.32%     2.75%

TABLE 3. PERFORMANCE DATA - 16 KTS WIND

                    PD Trig    CD Trig    Int Cont   Slid M
AC track error      6.3 m      6.8 m      38.8 m     4.3 m
Image track error   18.1 m     10.0 m     49.6 m     14.8 m
Average banking     1.3 º      1.4 º      1.8 º      6.7 º
Saturation level    0.00%      0.00%      0.00%      4.71%

TABLE 4. PERFORMANCE DATA - SENSOR NOISE (AVERAGING 5 METERS IN POSITION, 2 M/S IN SPEED)

                    PD Trig    CD Trig    Int Cont   Slid M
AC track error      13.2 m     8.5 m      6.0 m      6.1 m
Image track error   18.6 m     18.2 m     17.0 m     18.4 m
Average banking     2.3 º      2.4 º      2.2 º      7.0 º
Saturation level    0.00%      0.00%      0.00%      7.54%

These results highlight the good performance and robustness of the sliding mode controller, in spite of some actuator saturation. The CD method exhibits a lower image track error due to its good general tracking and to the low banking imposed. The interception contour controller is the least sensitive to sensor noise (Tab. 4), but it is also unable to counteract wind disturbances. Table 4 shows that none of the controllers is substantially sensitive to position and ground speed errors.

CONCLUSIONS
This paper presented a video tracking system for UAV systems. The flight controller has five main states and a safe mode state. The main contribution of this work is the comparison among structure tracking methods. We adapted and compared several nonlinear tracking algorithms: two trigonometric controllers, a controller based on an interception contour curve and a sliding mode controller. First, we tuned and tested the controllers for a simple kinematic model, with perfect sensors and no disturbances. Later we repeated the experiments with wind and sensor noise. The sliding mode controller presented the best global behavior. With the addition of wind and sensor noise to the simulation, the sliding mode controller exhibited chattering behavior, with significant banking. The interception contour controller led to smoother and accurate trajectories, but stability is only assured for small or medium initial misalignments; this method was also unable to compensate for wind disturbances. The trigonometric controllers maintained a similar performance in all simulation conditions. The CD trigonometric method has a tracking performance similar to the sliding mode controller, but is smoother; its only drawbacks are the possible instability when the structure local heading is near 90º to the structure average heading and the slower convergence to the structure. The PD trigonometric method presents slowly damped oscillations around the structure with the current tuning, showing the highest tracking error. Future work will address the instability issues of the trigonometric and interception contour algorithms. Wind compensation will also be added to the interception contour method. Further tests will include video analysis algorithms, hardware-in-the-loop simulations and field tests.

ACKNOWLEDGMENTS
The research leading to this work was partly funded by Financiamento pluri-anual da Unidade de I&D da FEUP Instituto de Sistemas e Robótica - ISR - Porto. It was also funded by the Fundação para a Ciência e a Tecnologia (FCT - Foundation for Science and Technology) under PhD grant SFRH/BD/40764/2007. We gratefully acknowledge the support of the Portuguese Air Force Academy.

REFERENCES
[1] Girard, A. R., Howell, A. S., and Hedrick, J. K., 2004. “Border Patrol and Surveillance Missions using Multiple Unmanned Air Vehicles”, Proceedings of the IEEE Conference on Decision and Control, Paradise Island, Bahamas.
[2] Almeida, P., Bencatel, R., Gonçalves, G. M., and Sousa, J. B., 2006. “Multi-UAV Integration for Coordinated Missions”, Encontro Científico de Robótica, Guimarães, April.
[3] Almeida, P., Bencatel, R., Gonçalves, G. M., Sousa, J. B., and Ruetz, C., 2007. “Experimental results on Command and Control of Unmanned Air Vehicle Systems”, 6th IFAC Symposium on Intelligent Autonomous Vehicles (IAV’07), France, September.
[4] Vaglienti, B., Hoag, R., and Niculescu, M., 2004. Piccolo system user guide, July. See also URL www.cloudcaptech.com
[5] Sousa, J. B., Simsek, T., and Varaiya, P., 2004. “Task planning and execution for UAV teams”, Proceedings of the Decision and Control Conference, Bahamas.
[6] Niculescu, M., 2001. “Lateral track control law for Aerosonde UAV”, Proceedings of the 39th AIAA Aerospace Sciences Meeting and Exhibit, Reno.
[7] Rathinam, S., Almeida, P., Kim, Z., Jackson, S., Tinka, A., Grossman, W., and Sengupta, R., 2006. “Autonomous searching and tracking of a river using an UAV”. C3UV, University of California, Berkeley.
[8] Nelson, D. R., Barber, D. B., McLain, T. W., and Beard, R. W., 2007. “Trajectory tracking for Unmanned Air Vehicles with velocity and heading rate constraints”. IEEE Transactions on Robotics, Vol. 23, no. 3, June.
