


        CS 5**: Homework Assignment 1
        Due: February 15th, 5:59PM
        Department of Computer Science
        Stevens Institute of Technology
        Collaboration Policy. Homeworks may be done individually or in teams of two. It is acceptable
        for students of different teams to collaborate in understanding the material but not in solving the
        problems. Use of the Internet is allowed, but should not include searching for previous solutions
        or answers to the specific questions of the assignment. I will assume that, as participants in a
        graduate course, you will be taking the responsibility of making sure that you personally
        understand the solution to any work arising from collaboration.
        Late Policy. 3% penalty per partial 24-hour period of delay.
        Submission Format. Electronic submission on Canvas is mandatory. Submit a zip file
        containing:
        A PDF file with:
        • at most one page of text explaining anything that is not obvious. Also include:
        • richly documented source code (excluding libraries),
        • points used in the computation,
        • resulting images,
        • instructions for running your code, including the execution command string that
        would generate your results.
        A separate directory for all code.
        A separate directory for all generated imagery.
        Problem 1. (50 points)
        The goal is for you to apply your knowledge of Homography estimation from a set of image
        features in order to perform a simple image warping task. In particular, you are expected to
        implement
        a) The DLT algorithm for homography estimation using pixel feature locations (15 pts; a sketch follows this list)
        b) 2D bilinear interpolation to render the output image (10 pts; see the sketch after the warping description below)
        c) The DLT algorithm for homography estimation using line feature locations (25 pts; see the sketch after the notes below)
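        As a reference for part (a), here is a minimal MATLAB sketch of the point-based DLT. It is only a sketch, not the
        required implementation: the function name dlt_homography_points and the 2xN input layout are assumptions, and the
        coordinate normalization recommended in practice for numerical conditioning is omitted for brevity.

        % Point-based DLT sketch: src and dst are 2xN pixel coordinates (N >= 4,
        % no three points collinear); returns H such that dst ~ H * src (homogeneous).
        function H = dlt_homography_points(src, dst)
            N = size(src, 2);
            A = zeros(2*N, 9);
            for i = 1:N
                x  = src(1,i);  y  = src(2,i);
                xp = dst(1,i);  yp = dst(2,i);
                A(2*i-1,:) = [0, 0, 0,  -x, -y, -1,   yp*x,  yp*y,  yp];
                A(2*i,  :) = [x, y, 1,   0,  0,  0,  -xp*x, -xp*y, -xp];
            end
            [~, ~, V] = svd(A);           % right singular vector of the smallest
            h = V(:, end);                % singular value = least-squares null space
            H = reshape(h, 3, 3)';        % rows of H are h(1:3), h(4:6), h(7:9)
            H = H / H(3,3);               % fix the scale (optional)
        end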
        Download the image of the basketball court from the Canvas course website. Then, generate a
        blank 940 × 500 image and warp the basketball court only from the source image, where it
        appears distorted, to the new image so that it appears as if the new image was taken from directly
        above.
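        For part (b), a minimal inverse-warping sketch with bilinear sampling. The name warp_image is made up, and it
        assumes the supplied H maps output (top-view) pixel coordinates to source-image pixel coordinates (pass the inverse
        of the DLT result if you estimated the mapping in the other direction) and a uint8 input image.

        % Inverse warp: for every output pixel, look up the source location through H
        % and blend the four surrounding source pixels bilinearly.
        function out = warp_image(src, H, out_w, out_h)
            src = double(src) / 255;                 % assumes a uint8 input image
            out = zeros(out_h, out_w, size(src, 3));
            for v = 1:out_h
                for u = 1:out_w
                    p  = H * [u; v; 1];
                    xs = p(1) / p(3);  ys = p(2) / p(3);   % source-image location
                    x0 = floor(xs);    y0 = floor(ys);
                    if x0 < 1 || y0 < 1 || x0 + 1 > size(src, 2) || y0 + 1 > size(src, 1)
                        continue;                    % maps outside the source image
                    end
                    a = xs - x0;  b = ys - y0;       % fractional offsets
                    out(v, u, :) = (1-a)*(1-b)*src(y0, x0, :) + a*(1-b)*src(y0, x0+1, :) ...
                                 + (1-a)*b*src(y0+1, x0, :)  + a*b*src(y0+1, x0+1, :);
                end
            end
        end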
        Notes.
        • You are allowed to use image reading and writing functions, but not homography estimation
        or bilinear interpolation functions.
        • For P1a, MATLAB, GIMP or IrfanView (Windows only) can be used to click on pixels and
        record their coordinates.
        • For P1c (line coordinates), you are free to use the same (four) corner points used in P1a (and
        define lines based on their coordinates) or determine new lines (e.g. other lines in the image).
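        For part (c), one way to reuse the DLT machinery for lines: if points map as x_dst = H·x_src, then lines transform
        as l_src ∝ Hᵀ·l_dst, so running the same two-equations-per-correspondence system on the line vectors with the roles
        of source and destination swapped estimates Hᵀ. A minimal sketch under that reading, again with an assumed function
        name and no coordinate normalization:

        % Line-based DLT sketch: src_lines and dst_lines are 3xN homogeneous line
        % coefficients, e.g. l = cross([x1;y1;1], [x2;y2;1]) from two points on the line.
        function H = dlt_homography_lines(src_lines, dst_lines)
            N = size(src_lines, 2);
            A = zeros(2*N, 9);
            for i = 1:N
                a = dst_lines(:, i)';                % 1x3, plays the "source" role for H'
                b = src_lines(:, i);                 % 3x1, plays the "destination" role
                A(2*i-1, :) = [zeros(1,3), -b(3)*a,   b(2)*a];
                A(2*i,   :) = [ b(3)*a, zeros(1,3),  -b(1)*a];
            end
            [~, ~, V] = svd(A);
            G = reshape(V(:, end), 3, 3)';           % src_lines ~ G * dst_lines, i.e. G = H'
            H = G';
            H = H / H(3,3);
        end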
        Problem 2. (50 points) Object-centered motion
        The goal is for you to apply your knowledge of the pinhole camera model by controlling both the
        internal and external parameters of a virtual camera to generate a camera path that “locks in” to the foreground
        object (i.e. the foreground object should stay centered and retain a constant size in the image throughout the
        entire capture sequence).
        In order to approximate photorealistic image generation, you are provided with a dense point cloud
        augmented with RGB color information. To obtain a rendered image you can use the provided
        rendering function PointCloud2Image, which takes as input a projection matrix and transforms the
        3D point cloud into a 2D image (see below for details). Your task will be to:
        1) Design a path that performs a half circle around (i.e. centered on) the foreground object (in this
        case a fish statue)
        2) Design a sequence of projection matrices corresponding to each frame of the capture sequence
        3) Use the provided code to render each of the individual images (capture frames).
        The main challenges are
        a) Set up the camera extrinsics and intrinsics to achieve the desired initial image position
        b) Design a suitable pose interpolation strategy
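        A minimal sketch of one possible path and pose design, addressing both challenges; it is not the required solution.
        The function name, the choice of world “up” direction, and the plane of the circle are assumptions that may need
        adjusting for this scene's coordinate convention, and the starting angle and radius still have to be tuned to
        satisfy the setup constraint below.

        % Half-circle camera path: returns a cell array of 3x4 projection matrices,
        % one per frame, with the camera always looking at obj_center.
        function Ps = camera_path_halfcircle(K, obj_center, radius, n_frames)
            Ps = cell(1, n_frames);
            up = [0; -1; 0];                           % assumed world "up" (image y points down)
            angles = linspace(0, pi, n_frames);        % evenly interpolated half circle
            for i = 1:n_frames
                offset = radius * [cos(angles(i)); 0; -sin(angles(i))];
                C = obj_center + offset;               % camera center on the circle
                z = obj_center - C;  z = z / norm(z);  % optical axis points at the object
                x = cross(up, z);    x = x / norm(x);
                y = cross(z, x);
                R = [x'; y'; z'];                      % world-to-camera rotation (rows = camera axes)
                t = -R * C;
                Ps{i} = K * [R, t];                    % 3x4 projection matrix for frame i
            end
        end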
        Setup: Start the sequence using the camera’s original internal calibration matrix K (provided in the
        data.mat file) and position the camera so that, in the initial image, the foreground object occupies a
        bounding box of approximately 400 by 640 pixels (width by height).
        (For reference, positioning the camera at the origin renders the foreground object within a
        bounding box of size 250 by 400 pixels.)
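        One hedged way to read these numbers: both bounding-box dimensions must grow by 400/250 = 640/400 = 1.6 relative to
        the reference pose at the origin, so under the pinhole model you can either scale the focal lengths in K by 1.6 (a
        zoom) or keep K and reduce the camera-to-object distance to 1/1.6 of its reference value. A tiny sketch of the
        first option (K0 is my name for the adjusted matrix, not something defined by the assignment):

        s  = 1.6;               % 400/250 = 640/400
        K0 = K;                 % original calibration matrix from data.mat
        K0(1,1) = s * K(1,1);   % scale fx
        K0(2,2) = s * K(2,2);   % scale fy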
        Notes: Implementation details & Matlab Code
        The file data.mat contains the scene of interest represented as a 3D point cloud, the camera internal
        calibration matrix to be used along with the image rendering parameters. All these variables are to
        be loaded into memory and need not be modified.
        The file PointCloud2Image.m contains the point cloud rendering function, whose signature is
        img = PointCloud2Image(P, Sets3DRGB, viewport, filter_size). P denotes a 3x4 projection matrix and
        should be the only parameter you will need to vary when calling this function, as the remaining
        parameters should remain constant.
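        A hedged usage sketch for the rendering call. Only the signature above is documented; the exact variable names
        stored in data.mat and the way Sets3DRGB is packed from the two point-cloud variables are guesses, so follow
        SampleCameraPath.m for the authoritative usage.

        load('data.mat');                  % K, BackgroundPointCloudRGB, ForegroundPointCloudRGB, render params
        R = eye(3);  t = zeros(3, 1);      % reference pose: camera at the world origin
        P = K * [R, t];                    % 3x4 projection matrix
        % Cell packing of the two point clouds is a guess; mirror SampleCameraPath.m here.
        Sets3DRGB = {BackgroundPointCloudRGB, ForegroundPointCloudRGB};
        img = PointCloud2Image(P, Sets3DRGB, viewport, filter_size);
        imwrite(img, 'frame_000.png');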
        A simplified example of how to use the function is included in the file SampleCameraPath.m. The
        provided sample code does not produce the circling effect; it only displaces the camera towards the
        scene. It will be your task to manipulate the camera internal and external parameters to get the
        desired result.
        The point cloud data is contained in two variables, BackgroundPointCloudRGB and
        ForegroundPointCloudRGB, each a 6xN matrix. The first three rows describe the 3D
        coordinates of a point while the last three contain the corresponding RGB values. You may need to
        examine the ForegroundPointCloudRGB to determine the required camera positions. The pointcloud
        was generated from a single depthmap where the foreground object was masked out and its depth
        reduced by half.
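        Since the foreground layout is documented only as a 6xN matrix (XYZ in rows 1–3, RGB in rows 4–6), a small sketch
        of how one might estimate the object's center (to aim the camera at) and rough extent (to help pick a circle
        radius); the variable names are mine:

        fg_xyz     = ForegroundPointCloudRGB(1:3, :);              % 3xN coordinates of the fish statue
        obj_center = mean(fg_xyz, 2);                              % 3x1 centroid to "lock" the camera onto
        obj_extent = max(fg_xyz, [], 2) - min(fg_xyz, [], 2);      % rough size along each axis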

        Figure 2. Bird’s-eye view of the observed scene
        The generated video should be approximately 5 seconds in length at a frame rate of 5Hz.
        WMV will be the only format accepted. 
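        Putting the pieces together: 5 seconds at 5 Hz is 25 frames. Below is a hedged assembly sketch that reuses the
        hypothetical names introduced in the sketches above; note that, to my knowledge, MATLAB's VideoWriter offers no
        WMV profile, so this writes MPEG-4 and assumes an external conversion to WMV before submission.

        n_frames = 5 * 5;                               % 5 seconds at 5 Hz = 25 frames
        radius   = 2 * max(obj_extent);                 % hand-tuned; adjust to meet the setup constraints
        Ps = camera_path_halfcircle(K0, obj_center, radius, n_frames);
        vw = VideoWriter('fish_halfcircle', 'MPEG-4');  % convert to WMV externally before submitting
        vw.FrameRate = 5;
        open(vw);
        for i = 1:n_frames
            img = PointCloud2Image(Ps{i}, Sets3DRGB, viewport, filter_size);
            writeVideo(vw, max(0, min(1, img)));        % assumes img is returned as double in [0,1]
        end
        close(vw);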
