Motion Texture. Harriet Pashley. Advisor: Yanxi Liu. Ph.D. Student: James Hays.


1. Introduction

Motion capture data is often used in movies and video games because it can realistically depict human motion. One of its greatest limitations is its inflexibility for reuse across projects: getting the exact shot you want requires an individual motion capture session, which is both costly and time consuming. We would like to take existing data and easily synthesize more of it, or modify it slightly to suit our needs. Motion capture databases such as Carnegie Mellon's [http://graphics.cs.cmu.edu] divide clips into different actions, and it is difficult to piece these clips together in a reasonable way to create a desired longer motion.

There have been many successful results in texture synthesis for a variety of textures using different approaches. Some methods synthesize texture pixel by pixel based on each pixel's neighborhood [Efros and Leung, 1999], while others use patches which are overlapped and then sewn together along an optimal seam [Efros and Freeman, 2001]. Recently there have been advances in texture synthesis which can be applied to the growth of motion capture data, namely the specific treatment of near-regular textures [Liu, Lin, Hays, 2004]. This class of textures has regularities and symmetries combined with a limited amount of random noise and imperfection. Near-regular textures are seen throughout nature, including cloth patterns, brick walls, and even gait patterns [Liu, Collins, Tsin, 2002]. Algorithms have been tailored to near-regular textures to exploit their regularity while still preserving their subtle differences [Liu, Lin, Hays, 2004]. These synthesis methods use PCA to find the average tile and then deviate from it to form new tiles, which are then pieced together to form a larger texture.
We extend this concept to stride cycles in motion capture data. In Fig. 1 the repetitive nature of the foot and arm joints is apparent over the course of three different cycles. Texture synthesis algorithms are able to faithfully reproduce arbitrary amounts of a given input texture.

Figure 1: Each frame of the motion is viewed as a row of the graph; time increases from top to bottom. The periodicity can be observed in the feet and arm joints across three different cycles. (Labeled regions: Arms, Feet.)

2. Related Work

There have been many papers published on the subject of motion synthesis. Researchers have been able to piece together segments of motion capture data by specifying descriptions of motion in a specific order, such as walk, run, jump [Arikan, Forsyth, and O'Brien, 2003]. To reduce an animator's workload, research has been done to key-frame a small number of joints and use motion capture data to fill in the remaining joints based on the correlation between them [Pullen and Bregler, 2002]. Our work deviates from previous work that synthesizes human motion by stitching together data which is statistically similar: there, motion is viewed as a set of linear dynamic systems together with a transition matrix encoding how likely each system is to transition to another [Li, Wang and Shum, 2002]. The most widely used method to transition between two motion clips is linear interpolation. The physical correctness of interpolated motions has been justified by evaluating them in terms of standard physical properties, such as linear and angular momentum, the contact between feet and the floor, and the continuity of velocity; however, the result is still not guaranteed to look natural [Safonova and Hodgins, 2005]. In this paper we examine each category of texture synthesis algorithm mentioned above, and explore the importance of correlation between joints.

3. Texture Synthesis Algorithms

3.1 Pixel-Based Methods

In their 1999 paper, Efros and Leung create arbitrary amounts of texture using a data structure of textons (see Fig. 2). A texton defines the basic unit of a texture: it contains the color information of a center pixel and of the pixels in its immediate vicinity, called neighbor pixels. Textons can be created for all pixels of the input texture. When synthesizing, the texton t composed of a neighborhood of size n around the current pixel p is compared, using a distance metric d, with each texton t' of the input texture. The center pixel of the texton t' with the smallest distance from t becomes the value of p. This process is repeated for each pixel to be synthesized.

Applying this to motion capture data, we equate each pixel with an angle value a at frame f and degree of freedom x of a joint. Our texton, which we will refer to as a motion texton, is the set of neighboring angles around the current angle. (This is not the same definition used by Li, Wang, and Shum (2002).) We deviate from Efros and Leung's method only in the search space for each angle a. If we searched over the entire original motion, we could synthesize a head joint angle with a foot joint angle. To avoid this, we minimize the distance between a and a', where a' ranges over a1, ..., az, the values of the same degree of freedom, and z is the length of the input motion sequence.
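As an illustration (not part of the original system), the per-degree-of-freedom search could be sketched in Python as follows. The function name, the causal (backward-looking) neighborhood, and the sum-of-squared-differences distance are our assumptions rather than details from the text:

```python
import numpy as np

def synthesize_dof(angles, out_len, n=20):
    """Grow one degree of freedom by motion-texton matching.

    angles: 1-D array of joint angles for a single DOF (the input motion).
    n: texton (neighborhood) size in frames; roughly one gait cycle
       (about 20 frames) per the text.
    """
    z = len(angles)
    # All length-n textons of this DOF; restricting the search to one DOF
    # prevents, e.g., synthesizing a head angle from a foot angle.
    candidates = np.stack([angles[i:i + n] for i in range(z - n)])
    out = list(angles[:n])                  # seed with the first texton
    while len(out) < out_len:
        texton = np.asarray(out[-n:])       # neighborhood behind current frame
        d = np.sum((candidates - texton) ** 2, axis=1)
        best = int(np.argmin(d))
        out.append(angles[best + n])        # value following the best match
    return np.asarray(out)
```

Each output angle is copied from the input sequence, so the synthesized motion stays within the range of the original data.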
This produced a visually pleasing result when given a sufficiently large neighborhood size. The neighborhood size was dependent on the gait cycle, about 20 frames on average.

Figure 2: A visualization of a texton.

3.2 Patch-Based Methods

Another approach to texture synthesis is the image quilting method of Efros and Freeman. They create a patch set consisting of all overlapping blocks of a user-specified size in the input texture image. An initial block is selected and placed in the larger image. The patch set is then searched for the block with the least variation in the region where it overlaps its neighboring blocks (see [Efros and Freeman, 2001]). This process is repeated for each patch as it is placed in the larger image, and an optimal path is computed in each overlapped region to seamlessly join the patches.

Relating this to motion capture data, we take two motions m1 and m2 (which may be the same motion) and search for the best overlap between them over n frames. This occurs at frame t1 of m1 and t2 of m2. We compute the optimal seam between m1 and m2 joint by joint, where x is each joint, as in this pseudocode:

    for i = t1 to t1 + n
        for j = t2 to t2 + n
            calculate distance between m1x[i] and m2x[j]
    return i and j of minimum distance computed

This means that over the n transitional frames each joint x can transition between m1 and m2 at a different frame, yielding a smoother transition. When m1 and m2 are the same motion, a threshold must be set on how quickly the motion can transition (a minimum value of t1); without this threshold, the best overlap will trivially be found at frame 1.
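The pseudocode above could be realized in Python as follows; the function name and the absolute-difference distance are our assumptions:

```python
import numpy as np

def best_transition(m1, m2, t1, t2, n):
    """Per-joint seam search over an n-frame overlap window.

    m1, m2: arrays of shape (frames, joints) of joint-angle values.
    t1, t2: frames where the overlap windows start in m1 and m2.
    Returns, for every joint x, the pair (i, j) inside the windows where
    m1 and m2 are closest, i.e. where that joint switches from m1 to m2.
    """
    seams = []
    for x in range(m1.shape[1]):
        best = (None, None, np.inf)
        for i in range(t1, t1 + n):
            for j in range(t2, t2 + n):
                d = abs(m1[i, x] - m2[j, x])
                if d < best[2]:
                    best = (i, j, d)
        seams.append(best[:2])
    return seams
```

Because each joint returns its own (i, j), different joints may switch motions on different frames, which is exactly what allows the smoother transition described above.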

3.3 Tile-Based Methods

When using a pixel-based texture synthesis algorithm, the size of the neighborhood has a large impact on the quality of the results. A small neighborhood will not incorporate the structure of the texture into the distance calculation, yet even an arbitrarily large window does not capture the correct regularity. For the infinite number of regular textures there are only 17 different lattice/tile structures that define the symmetry of the texture. The lattice can be detected in a texture and the resulting tiles used for synthesis: tiles are placed and then blended together using dynamic programming. This preserves the intensity/color variations of the texture as well as its regularity [see Liu, Tsin and Lin, 2005]. While structural regularity is not preserved even with a large neighborhood size, the correct choice of neighbors remains an important part of patch-based texture synthesis methods.

4. Correlation Between the Degrees of Freedom of Joints

In the motion capture data format, each joint is given a number and the numbered joints are laid out sequentially in an array. Thus adjacent neighbors in the data structure may not be logical neighbors. For example, in Fig. 3 the lower back joint is stored next to the right toe joint. When using Efros and Leung's method (1999), these kinds of neighborhoods may be meaningless. To make the neighborhood meaningful we look at the correlation between joints, which we quantify using Pearson's correlation coefficient, ρ. This coefficient is a value between -1 and 1 which measures the linear relationship between two sets of numbers s1 and s2: 1 is a perfect correlation between s1 and s2, 0 is no relationship between the two sets, and -1 is a perfect inverse relationship.

Figure 3: The physical neighbors of a joint are not logical neighbors of a joint.
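A minimal sketch of the pairwise correlation computation, assuming the motion is stored as a frames-by-DOFs array (the function name is hypothetical):

```python
import numpy as np

def dof_correlation_weights(motion):
    """Pearson correlation between every pair of degrees of freedom.

    motion: array of shape (frames, dofs) of angle values.
    Returns a (dofs, dofs) matrix of |rho| values, suitable for weighting
    neighbors in the distance metric; taking the absolute value treats
    positively and negatively correlated DOFs equally.
    """
    rho = np.corrcoef(motion, rowvar=False)   # rho[x, y] = Pearson's rho
    return np.abs(rho)
```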
The lower back and right toe are not correlated joints, yet using Efros and Leung's method (1999) their values would affect one another when synthesizing a joint. Applying this to our motion synthesis, we compute ρ for every pair of degrees of freedom in motion m:

    ρ_xy = ( E[xy] - E[x] E[y] ) / ( σ_x σ_y )

We store these values in an array and use them to weight the values of the physical neighbors of the degrees of freedom of joint x during synthesis. The same algorithm is used as described in the previous section, with the exception of the distance metric d. For each degree of freedom of joint x,

    d = Σ_{i=1}^{n} | neighbor_weights · frame_i |

where n is the size of the neighborhood. The absolute value is taken because we want to weight positively and negatively correlated joints equally, as they provide the same amount of information about the motion.

5. PCA Tiling of Motion Joints

A near-regular texture [Liu, Lin and Hays, 2004] can be decomposed into two parts: a regular texture and its deformation field. When synthesizing a brick wall, the authors find the average brick, which they call the mean tile. They then use PCA to construct the bases of the tile and create a linear combination of the mean tile and its bases. We look at motion capture data in this light: we manually segment a walk into strides. We define a stride to start at the first frame at which the right foot touches the ground and to end at the last frame before the right foot touches the ground again at the end of the walking cycle. These strides are averaged together to form the mean stride. The bases are constructed, and new motion can be created by forming a linear combination of the mean stride and its bases. The strides created by this method can be pieced together using any of the three methods described in Section 3.

One of the difficulties of this approach is creating motions without foot sliding. To compute the mean stride, cycles must be interpolated to the length of the longest stride. This stretches the shorter cycles, causing feet to slide. We are currently working on ways to mitigate this problem, as explained in our future work. It would also be instructive to find a means of computing a threshold for eigenvector coefficients that produces realistic motions.

6. Future Work

We are looking to expand our work, firstly by acquiring better data for PCA: by using many cycles of a motion, a more meaningful average stride can be computed. We will also examine our method with data where the feet are fairly regular (similar stride lengths and times) and the hands are uncorrelated. We would like to expand the autonomy of our method by automatically detecting the beginning and end of each cycle in a motion, which would greatly reduce the pre-processing steps of our current method for synthesis using PCA. Currently, our method needs a post-processing step to clean up foot sliding. We hope to use correlation to automatically detect sliding feet: when feet begin to slide, we hypothesize that the correlation between the velocity of the feet and the velocity of the root will change. The velocity of the root can be calculated by finding the difference between positions of the root over the duration of the motion.
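The hypothesized slide detector could be sketched as follows; the windowed formulation, the window size, and the function name are our assumptions, not details from the text:

```python
import numpy as np

def slide_score(root_pos, foot_pos, window=30):
    """Windowed correlation between root speed and foot speed.

    root_pos, foot_pos: (frames, 3) world-space positions.
    Returns one Pearson rho per window; a drop in rho would be the
    hypothesized signature of foot sliding. The 30-frame window is a
    hypothetical choice.
    """
    # Per-frame speeds via finite differences of position.
    root_v = np.linalg.norm(np.diff(root_pos, axis=0), axis=1)
    foot_v = np.linalg.norm(np.diff(foot_pos, axis=0), axis=1)
    scores = []
    for s in range(0, len(root_v) - window + 1, window):
        r = np.corrcoef(root_v[s:s + window], foot_v[s:s + window])[0, 1]
        scores.append(r)
    return np.asarray(scores)
```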
Finding the velocity of the feet in world coordinates would allow for a comparison between the velocities of the two joints.

We are also aware that it may be better to synthesize each joint of the motion as a whole instead of synthesizing its degrees of freedom separately. This is not straightforward, as different joints have different numbers of degrees of freedom; it involves converting each joint from a Euler angle representation into a quaternion representation. This affects how the correlation between joints is computed, as the correlation of two vectors is yet to be defined precisely. Our computational treatment of motions as near-regular textures may provide valuable insights into the improvement of motion transplants [Ikemoto and Forsyth, 2004]. Using graph cuts [Kwatra, Schödl, et al., 2003] we will be able to stitch together subsets of different motions across time, hopefully improving existing results.

REFERENCES

Arikan, O., Forsyth, D.A., and O'Brien, J.F. 2003. Motion synthesis from annotations. ACM Transactions on Graphics, 22(3):402-408.

Efros, A.A. and Freeman, W.T. 2001. Image quilting for texture synthesis and transfer. In SIGGRAPH, pp. 35-42.

Efros, A.A. and Leung, T.K. 1999. Texture synthesis by non-parametric sampling. In International Conference on Computer Vision.

Ikemoto, L. and Forsyth, D.A. 2004. Enriching a motion collection by transplanting limbs. ACM Transactions on Graphics, pp. 99-108.

Kwatra, V., Schödl, A., et al. 2003. Graphcut textures: image and video synthesis using graph cuts. ACM Transactions on Graphics, 22(3):277-286.

Li, Y., Wang, T., and Shum, H.Y. 2002. Motion texture: a two-level statistical model for character motion synthesis. ACM Transactions on Graphics, pp. 465-472.

Liu, Y., Lin, W.C., and Hays, J. 2004. Near-regular texture analysis and manipulation. ACM Transactions on Graphics, 23(3):368-376.

Liu, Y., Collins, R.T., and Tsin, Y. 2002. Gait sequence analysis using frieze patterns. In European Conference on Computer Vision, Copenhagen, Denmark, May 28-31, 2002.

Liu, Y., Tsin, Y., and Lin, W. 2005. The promise and perils of near-regular texture. International Journal of Computer Vision, 62(1-2):145-159.

Pullen, K. and Bregler, C. 2002. Motion capture assisted animation. In SIGGRAPH '02.

Safonova, A. and Hodgins, J.K. 2005. Analyzing the physical correctness of interpolated human motion. ACM Transactions on Graphics, pp. 171-180.