





Workshops & tutorials will be held on Oct. 20 (SAT), 2012
* Miniworkshop on Nonlinear Synchronization
Date & Time : October 18 (THU) 13:00~15:00
Room : 302
Presentation : Korean
Admission : free of charge for ICCAS 2012 participants
Program
13:00~13:40
Synchronization of Chaotic FitzHugh-Nagumo Neuron Models
Prof. Keum-Shik Hong / Pusan National Univ., Mechanical Engineering Department

13:40~14:20
Recent results on the complete synchronization of the Kuramoto model on networks
Prof. Seung-Yeal Ha / Seoul National Univ., Department of Mathematical Sciences

14:20~15:00
Synchronization, Strong Coupling, and Robustness
Prof. Hyungbo Shim / Seoul National Univ., Electrical Engineering Department

• Tutorial 1 
Filtering Theory and Applications to Integrated Navigation (Cancelled)
• Time 
8:30~15:00 (Oct. 20, 2012) 
• Organizer 
Prof. Chan Gook Park (Seoul National University) 
• Fee 
Student 140,000 Won, Regular 200,000 Won 
• Presentation 
Korean 
• Program 
The principal goal of this tutorial is to provide an introduction to the basic principles and
applications of the linear Kalman filter, the unscented Kalman filter, and particle filters for the integration
of GPS (Global Positioning System) with inertial navigation systems (INS) and dead reckoning (DR) systems.
Fundamental concepts of filtering techniques, with detailed mathematical development, will be introduced
so that participants can build a solid background in the basics of the Kalman filter as well as general filtering theory.
Given the importance of Kalman filtering in practical GPS/INS integration, applications
of the Kalman filter to advanced car navigation will also be presented after the basic theory.
The tutorial will deliver highly useful knowledge and experience for graduate students working on related research,
scientists at government institutes, and field engineers involved in practical projects.
The one-day tutorial consists of three parts. The first session covers the introduction and mathematical
development of linear Kalman filter theory. The second session discusses more advanced
filters such as the unscented filter and particle filters. The last session discusses
various useful aspects of GPS/DR integration for practical applications.
08:30 ~ 10:10: Lecture 1 (Kalman Filtering)
10:30 ~ 12:10: Lecture 2 (GPS/INS with Nonlinear Filters)
13:20 ~ 15:00: Lecture 3 (Application : Integrated Navigation)
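The predict/update recursion at the heart of Lecture 1 can be sketched in a few lines. The example below is a minimal one-dimensional linear Kalman filter estimating a constant position from noisy measurements; the noise parameters and measurement values are illustrative, not taken from the tutorial material.

```python
# Minimal sketch of the 1-D linear Kalman filter recursion
# (constant-position model; q, r, and the measurements are illustrative).
def kalman_1d(measurements, q=1e-4, r=0.1, x0=0.0, p0=1.0):
    x, p = x0, p0          # state estimate and its variance
    estimates = []
    for z in measurements:
        p = p + q          # predict: process noise inflates the variance
        k = p / (p + r)    # Kalman gain: how much to trust the measurement
        x = x + k * (z - x)  # update: blend prediction and measurement
        p = (1 - k) * p      # updated (reduced) estimate variance
        estimates.append(x)
    return estimates

est = kalman_1d([1.1, 0.9, 1.05, 0.95, 1.0])
```

The same predict/update structure carries over to the multivariate filters covered in Lectures 2 and 3, with matrices in place of the scalars.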

• Tutorial 2 
Model Predictive Control: Online optimization based approach vs. explicit approach

• Time 
8:30~17:00 (Oct. 20, 2012) 
• Organizer 
Prof. Jay H. Lee (KAIST) / Prof. E. N. Pistikopoulos (Imperial College, London) 
• Fee 
Student 180,000 Won, Regular 250,000 Won 
• Presentation 
English 
• Program 
The principal goal of this tutorial is to provide an introduction
to the basic principles and applications of linear and nonlinear model predictive control.
The first half of the tutorial will present the traditional approach of employing online
optimization. Closed-loop stability and optimality will be discussed, along with methods to
speed up the online optimization for problems requiring fast sampling rates.
The second half of the tutorial will present an explicit MPC approach in which multi-parametric
programming is used to parameterize the MPC control law explicitly offline.
The main advantage of explicit MPC is that the online optimization is replaced by a table
lookup, which can be considerably faster. Several applications of explicit MPC will be
presented.
08:30 ~ 10:00: Lecture 1 (Linear MPC)
10:30 ~ 12:00: Lecture 2 (Nonlinear MPC)
13:30 ~ 15:00: Lecture 3 (Explicit MPC Part I)
15:30 ~ 17:00: Lecture 4 (Explicit MPC Part II)
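The table-lookup idea behind explicit MPC can be illustrated directly: offline, multi-parametric programming partitions the state space into regions, each with its own affine control law u = K·x + k; online, the controller only finds the region containing the current state. The regions and gains below are illustrative placeholders, not the result of an actual MPC solve.

```python
# Sketch of the explicit-MPC lookup for a 1-D state: the control law is
# piecewise affine over intervals of the state space, computed offline.
# These regions and gains are illustrative, not from a real multi-parametric solve.
REGIONS = [
    # (lower bound, upper bound, gain K, offset k)
    (-10.0, -1.0, -0.5, -0.5),  # lower constrained region
    (-1.0,   1.0, -1.0,  0.0),  # interior (LQR-like) region
    ( 1.0,  10.0, -0.5,  0.5),  # upper constrained region
]

def explicit_mpc(x):
    """Online step: locate the region containing x, apply its affine law."""
    for lo, hi, K, k in REGIONS:
        if lo <= x <= hi:
            return K * x + k
    raise ValueError("state outside the explored region")
```

With many state dimensions the regions become polyhedra and the lookup a point-location problem, but the online cost remains far below that of solving an optimization at every sample.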

• Tutorial 3 
The operational space control framework (Robot Arm Control Methods)

• Time 
8:30~12:10 (Oct. 20, 2012) 
• Organizer 
Prof. Jaeheung Park (Seoul National University) 
• Fee 
Student 90,000 Won, Regular 120,000 Won 
• Presentation 
Korean 
• Program 
The operational space control framework provides a means for direct
task-space control of a robot by fully utilizing the robot dynamics, without using inverse kinematics.
The operational space, or task space, is typically defined as the position and orientation of the end-effector.
More generally, it can be defined as the position and orientation of any link of the robot, or any other
quantity that can be described mathematically, such as the center of mass of the system.
Understanding the operational space control framework provides not only knowledge of
task-oriented control but also insight into the robot dynamics. The lecture will
first go over basic robotics material: kinematics and joint-space dynamics.
Then, operational space (task space) dynamics and control will be presented.
Finally, the task-posture decomposition approach using task redundancy and hybrid position-force
control will be discussed.
08:30 ~ 10:10: Lecture 1 (Kinematics and dynamics in operational space)
10:30 ~ 12:10: Lecture 2 (The operational space control framework and task-posture decomposition)
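A key building block of task-space control, covered in Lecture 1, is the mapping between task-space forces and joint torques through the Jacobian: tau = J(q)^T F. The sketch below computes this static mapping for a planar two-link arm; the link lengths are illustrative assumptions, not values from the tutorial.

```python
import numpy as np

# Sketch of the task-to-joint mapping used in operational space control:
# a desired end-effector force F maps to joint torques via tau = J(q)^T F.
# Planar 2R arm with illustrative link lengths.
L1, L2 = 1.0, 0.8

def jacobian(q1, q2):
    """Geometric Jacobian of the end-effector position for a planar 2R arm."""
    s1, c1 = np.sin(q1), np.cos(q1)
    s12, c12 = np.sin(q1 + q2), np.cos(q1 + q2)
    return np.array([[-L1*s1 - L2*s12, -L2*s12],
                     [ L1*c1 + L2*c12,  L2*c12]])

def task_force_to_torque(q1, q2, F):
    """Static mapping tau = J(q)^T F at configuration (q1, q2)."""
    return jacobian(q1, q2).T @ F
```

The full framework additionally projects the joint-space inertia, Coriolis, and gravity terms into the task space, so that control laws can be designed directly on task coordinates.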

• Tutorial 4 
Robot Vision: Principles and Applications

• Time 
13:30~17:00 (Oct. 20, 2012) 
• Organizer 
Prof. In So Kweon (KAIST) 
• Fee 
Student 90,000 Won, Regular 120,000 Won 
• Presentation 
Korean 
• Program 
Robot vision gives robots the ability to perceive the external world in order to perform tasks such as navigation, visual tracking, and object detection and recognition.
Most robot vision algorithms run in four steps: image acquisition, low-level image processing, mid-level image matching, and high-level information extraction. This tutorial introduces the basic principles of robot vision and state-of-the-art technologies, including some real-world applications.
Specifically, the topics include geometric/photometric camera calibration, image feature extraction, feature matching and recognition, and 3D reconstruction.
For 3D reconstruction of static and dynamic scenes, we introduce several robust robot vision methods, ranging from image enhancement to the design of novel camera systems. We also present a unified framework for sensor fusion: (i) "camera + depth" fusion camera systems, (ii) a fast bundle-adjustment-based approach for large-scale datasets, and (iii) a novel coded-light photometric stereo for modeling dynamic 3D scenes. This framework combines the advantages of the two sensor systems while compensating for their individual weaknesses. As an important application of robot vision, we demonstrate the robustness of these methods by automatically reconstructing a large-scale environment, such as the KAIST campus.
13:30 ~ 15:00: Lecture 1 (Introduction to robot vision systems)
15:30 ~ 17:00: Lecture 2 (Visionbased localization and 3D mapping)
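The geometric camera calibration topic above rests on the pinhole projection model: a 3-D point in the camera frame maps to pixel coordinates through the intrinsic matrix K. The sketch below shows that projection; the focal length and principal point are illustrative values, not from a calibrated camera.

```python
import numpy as np

# Sketch of the pinhole projection underlying geometric camera calibration.
# Intrinsics (fx, fy, cx, cy) below are illustrative, not calibrated values.
K = np.array([[500.0,   0.0, 320.0],   # fx, skew, cx
              [  0.0, 500.0, 240.0],   # fy, cy
              [  0.0,   0.0,   1.0]])

def project(X):
    """Project a 3-D point (camera frame, Z > 0) to pixel coordinates."""
    x = K @ X              # homogeneous image coordinates
    return x[:2] / x[2]    # perspective division
```

Calibration estimates K (and lens distortion) from known scene geometry; the same model is then inverted in the 3D-reconstruction and localization steps of Lecture 2.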

* Lunch is provided when registering for both Tutorial 3 & 4.
 
