Utrecht Multi-Person Motion (UMPM) benchmark

What is it?

A dataset of synchronized video and motion capture data, which makes it particularly useful for evaluating 2D and 3D human pose estimation methods.

3D pose examples

2D pose examples

How to get it?

wget --http-user=umpmuser --http-passwd=YOURPASSWORD -i urls_of_files.txt

  • if you connect to the Internet through a proxy, edit the wget configuration file wgetrc in <your-wget-install-path>/etc and set the proxy address and port (see the sketch after this list)
  • wait some days… (each file is mostly raw video data of about 3 GB, and there are 77 files)
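
For reference, a minimal sketch of the proxy-related wgetrc entries; the proxy host and port below are placeholders that you need to replace with your own:

# enable the proxy and point wget at it (replace host and port)
use_proxy = on
http_proxy = http://proxy.example.com:8080/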

You can use the MLS Viewer to load the .c3d files and plot graphs of the marker coordinates. I found it especially helpful for seeing which markers are actually stored in the .c3d files and at which positions.

Unzip all the videos

Here is a simple batch file that, when called from the UMPM dataset root directory, extracts all the videos (which are stored in .xz archives) using 7-Zip:

cls
for /D %%i in (*) do (
    cd "%%i\Video"
    7z e -y *.xz
    cd ..\..
)

Body Model Used

There are 3 types of .c3d files provided in each GroundTruth folder:

p1_chair_2.c3d
p1_chair_2_ik.c3d
p1_chair_2_vm.c3d

The last one (_vm = virtual markers) contains a set of 15 body markers, computed from the real Vicon markers, that corresponds to a standard human body model.

For details see here.

How to project 3D poses to 2D?

The dataset contains camera calibration files, like this one:

1: 497.754 0 319.502 
2: 0 500.234 238.342 
3: 0 0 1 
4: -0.366762 0.19347 0.000117164 0.000160627 
5: 1.52268 1.50674 -0.980762 
6: -163.599 182.767 4606.09 

Here is the explanation:

Camera calibration file parameters:

Intrinsic parameters:
Lines 1-3  Camera matrix
Line 4     Distortion coefficients

Extrinsic parameters:
Line 5     Rotation vector
Line 6     Translation vector
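
Putting these together, projecting a 3D ground-truth point X to a pixel (u, v) follows the standard pinhole camera model. As a sketch (lens distortion omitted for brevity; OpenCV applies the coefficients from line 4 to the normalized coordinates x', y' before multiplying by the camera matrix K):

$$
R = \mathrm{Rodrigues}(r), \qquad
\begin{pmatrix} x_c \\ y_c \\ z_c \end{pmatrix} = R\,X + t, \qquad
x' = \frac{x_c}{z_c},\; y' = \frac{y_c}{z_c}, \qquad
\begin{pmatrix} u \\ v \\ 1 \end{pmatrix} = K \begin{pmatrix} x' \\ y' \\ 1 \end{pmatrix}
$$

where r and t are the rotation and translation vectors from lines 5 and 6, and K is the camera matrix from lines 1-3.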

You can then use the perspective projection camera model to map a 3D point to a 2D point. Here is a solution using OpenCV's legacy C API:

Pose2D* C3DParser::Project3DPoseTo2DPose(Pose3D* p3d)
{
    CvMat* camera_matrix;
    CvMat* dist_coeffs; 
    CvMat* rotVec;
    CvMat* transVec;
 
    camera_matrix  = cvCreateMat(3, 3, CV_32F);
    dist_coeffs    = cvCreateMat(1, 4, CV_32F);
    rotVec         = cvCreateMat(3, 1, CV_32F);
    transVec       = cvCreateMat(3, 1, CV_32F);
 
    // Set camera matrix
    for (int i=0; i<3; i++)
    {
        for (int j=0; j<3; j++)
        {
            cvmSet(camera_matrix,   i,j, IntrinsicCamMatrix[i][j]);
        }
    }
 
    // Set distortion coefficients
    for (int i=0; i<4; i++)
        cvmSet(dist_coeffs, 0, i, DistorsionPolynomParams[i]);
 
    // Set rotation vector
    for (int i=0; i<3; i++)
        cvmSet(rotVec, i, 0, RotationVec[i]);
 
    // Set translation vector
    for (int i=0; i<3; i++)
        cvmSet(transVec, i, 0, TranslationVec[i]);
 
 
    // Prepare array of points to project
    CvMat* object_points = cvCreateMat(NrMarkers, 3, CV_32F);
    for (int MarkerNr=0; MarkerNr<NrMarkers; MarkerNr++)
    {
        // Get the 3d point
        float x = p3d->MarkerPos[MarkerNr][0];
        float y = p3d->MarkerPos[MarkerNr][1];
        float z = p3d->MarkerPos[MarkerNr][2];
 
        // Copy point to OpenCV matrix
        cvmSet(object_points, MarkerNr, 0,   x);
        cvmSet(object_points, MarkerNr, 1,   y);
        cvmSet(object_points, MarkerNr, 2,   z);
    }
 
 
    // Prepare data structure for the 2D points (the projections of the 3D points)
    CvMat* image_points = cvCreateMat(NrMarkers, 2, CV_32F);
 
    // Now map the 3D points to 2D using OpenCV
    cvProjectPoints2(object_points,
                     rotVec,
                     transVec,
                     camera_matrix,
                     dist_coeffs,
                     image_points);
 
    // Generate the resulting 2D pose
    Pose2D* pose2D = new Pose2D();
    for (int MarkerNr=0; MarkerNr<NrMarkers; MarkerNr++)
    {
        pose2D->MarkerPos[MarkerNr][0] = cvmGet( image_points, MarkerNr, 0 );
        pose2D->MarkerPos[MarkerNr][1] = cvmGet( image_points, MarkerNr, 1 );
    }
 
    // Release the OpenCV matrices to avoid memory leaks
    cvReleaseMat(&camera_matrix);
    cvReleaseMat(&dist_coeffs);
    cvReleaseMat(&rotVec);
    cvReleaseMat(&transVec);
    cvReleaseMat(&object_points);
    cvReleaseMat(&image_points);
 
    return pose2D;
 
} // Project3DPoseTo2DPose
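
If you use a newer OpenCV version (3.x/4.x headers shown), the same projection can be done with the C++ API. The following is not the original code but a minimal sketch: readCalibLine and projectMarkers are hypothetical helpers, and it assumes a calibration file in exactly the six-line format shown above (each line prefixed with its number and a colon) and 3D markers given in the same world coordinate frame and units as the calibration:

#include <opencv2/core.hpp>
#include <opencv2/calib3d.hpp>
#include <fstream>
#include <sstream>
#include <string>
#include <vector>

// Read one calibration line ("N: v1 v2 ..."), dropping the leading "N:" label.
static std::vector<double> readCalibLine(std::ifstream& file)
{
    std::string line, label;
    std::getline(file, line);
    std::istringstream iss(line);
    iss >> label;                      // skip the "1:", "2:", ... prefix
    std::vector<double> values;
    double v;
    while (iss >> v)
        values.push_back(v);
    return values;
}

// Project a set of 3D markers into one camera view, given its calibration file.
std::vector<cv::Point2d> projectMarkers(const std::string& calibFile,
                                        const std::vector<cv::Point3d>& markers3d)
{
    std::ifstream file(calibFile);

    // Lines 1-3: camera matrix; line 4: distortion; line 5: rvec; line 6: tvec
    cv::Mat K(3, 3, CV_64F);
    for (int i = 0; i < 3; i++)
    {
        std::vector<double> row = readCalibLine(file);
        for (int j = 0; j < 3; j++)
            K.at<double>(i, j) = row[j];
    }
    cv::Mat dist = cv::Mat(readCalibLine(file)).clone();  // 4 distortion coefficients
    cv::Mat rvec = cv::Mat(readCalibLine(file)).clone();  // rotation vector
    cv::Mat tvec = cv::Mat(readCalibLine(file)).clone();  // translation vector

    std::vector<cv::Point2d> markers2d;
    cv::projectPoints(markers3d, rvec, tvec, K, dist, markers2d);
    return markers2d;
}

Note that cv::projectPoints applies the distortion model itself, so the resulting coordinates refer directly to pixel positions in the original (distorted) video frames.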

Do you want to know more?

  • read the corresponding dataset paper (HICV workshop, in conjunction with ICCV 2011)
 