
CIS5200 assignment writing; Java/Python programming help

        時(shí)間:2024-11-01  來源:合肥網(wǎng)hfw.cc  作者:hfw.cc 我要糾錯(cuò)



        CIS5200: Machine Learning Fall 2024
        Homework 2
        Release Date: October 9, 2024 Due Date: October 18, 2024
        • HW2 will count for 10% of the grade. This grade will be split between the written (30 points)
        and programming (40 points) parts.
• All written homework solutions must be formatted using LaTeX. Please use the
template here. Do not modify the template. This is a good resource for getting more
familiar with LaTeX if you are not yet comfortable with it.
        • You will submit your solution for the written part of HW2 as a single PDF file via Gradescope.
        The deadline is 11:59 PM ET. Contact TAs on Ed if you face any issues uploading your
        homeworks.
        • Collaboration is permitted and encouraged for this homework, though each student must
        understand, write, and hand in their own submission. In particular, it is acceptable for
        students to discuss problems with each other; it is not acceptable for students to look at
another student’s written solutions when writing their own. It is also not acceptable to
        publicly post your (partial) solution on Ed, but you are encouraged to ask public questions
        on Ed. If you choose to collaborate, you must indicate on each homework with whom you
        collaborated.
        Please refer to the notes and slides posted on the website if you need to recall the material discussed
        in the lectures.
        1 Written Questions (30 points)
        Problem 1: Gradient Descent (20 points)
        Consider a training dataset S = {(x1, y1), . . . ,(xm, ym)} where for all i ∈ [m], ∥xi∥2 ≤ 1 and
        yi ∈ {−1, 1}. Suppose we want to run regularized logistic regression, that is, solve the following
        optimization problem: for regularization term R(w),
min_w  (1/m) Σ_{i=1}^{m} log(1 + exp(−yi w⊤xi)) + R(w)
Recall: To show that a twice differentiable function f is µ-strongly convex, it suffices to show
that the Hessian satisfies ∇²f ⪰ µI. Similarly, to show that a twice differentiable function f is
L-smooth, it suffices to show that the Hessian satisfies LI ⪰ ∇²f. Here I is the identity matrix of
the appropriate dimension.
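For intuition on Problems 1.1 and 1.2, here is a minimal PyTorch sketch (synthetic data; not part of the assignment) that builds the unregularized objective and inspects the eigenvalues of its Hessian at w = 0. The eigenvalue range is a useful sanity check when reasoning about strong convexity and smoothness.

```python
import torch

torch.manual_seed(0)
m, d = 50, 5
X = torch.randn(m, d)
X = X / X.norm(dim=1, keepdim=True)          # enforce ||x_i||_2 <= 1
y = torch.sign(torch.randn(m))               # labels in {-1, +1}

def objective(w):
    # (1/m) * sum_i log(1 + exp(-y_i w^T x_i)), with R(w) = 0
    return torch.log1p(torch.exp(-y * (X @ w))).mean()

H = torch.autograd.functional.hessian(objective, torch.zeros(d))
eigs = torch.linalg.eigvalsh(H)
print("Hessian eigenvalues at w = 0:", eigs.min().item(), "to", eigs.max().item())
```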
        1.1 (3 points) In the case where R(w) = 0, we know that the objective is convex. Is it strongly
        convex? Explain your answer.
1.2 (3 points) In the case where R(w) = 0, show that the objective is 1-smooth.
        1.3 (4 points) In the case of R(w) = 0, what is the largest learning rate that you can choose such
        that the objective is non-increasing at each iteration? Explain your answer.
Hint: The answer is not 1/L for an L-smooth function.
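As a way to experiment with Problem 1.3, here is a small, self-contained gradient-descent sketch (synthetic data; the step size eta = 1.0 is only a trial value, not the answer) that records the objective at every iterate and checks whether it is non-increasing.

```python
import torch

torch.manual_seed(0)
m, d = 50, 5
X = torch.randn(m, d)
X = X / X.norm(dim=1, keepdim=True)          # ||x_i||_2 <= 1
y = torch.sign(torch.randn(m))               # labels in {-1, +1}

def objective(w):
    return torch.log1p(torch.exp(-y * (X @ w))).mean()

def run_gd(w0, eta, steps=200):
    """Plain gradient descent; records the objective at every iterate."""
    w, history = w0.clone(), []
    for _ in range(steps):
        w.requires_grad_(True)
        loss = objective(w)
        grad, = torch.autograd.grad(loss, w)
        history.append(loss.item())
        w = (w - eta * grad).detach()
    return w, history

# eta = 1.0 is a value to experiment with, not the answer to 1.3.
_, hist = run_gd(torch.zeros(d), eta=1.0)
print("non-increasing:", all(b <= a + 1e-12 for a, b in zip(hist, hist[1:])))
```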
        1.4 (1 point) What is the convergence rate of gradient descent on this problem with R(w) = 0?
        In other words, suppose I want to achieve F(wT +1) − F(w∗) ≤ ϵ, express the number of iterations
        T that I need to run GD for.
        Note: You do not need to reprove the convergence guarantee, just use the guarantee to provide the
        rate.
1.5 (5 points) Consider the following variation of the ℓ2 norm regularizer, called the weighted ℓ2
norm regularizer: for λ1, . . . , λd ≥ 0,
R(w) = Σ_{j=1}^{d} λj wj².
        Show that the objective with R(w) as defined above is µ-strongly convex and L-smooth for µ =
        2 minj∈[d] λj and L = 1 + 2 maxj∈[d] λj .
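Assuming the weighted ℓ2 penalty has the form R(w) = Σj λj wj² as stated above, the sketch below (synthetic data; illustrative only) numerically compares the claimed constants µ = 2 minj λj and L = 1 + 2 maxj λj against the extreme eigenvalues of the Hessian of the regularized objective.

```python
import torch

torch.manual_seed(1)
m, d = 50, 5
X = torch.randn(m, d)
X = X / X.norm(dim=1, keepdim=True)          # ||x_i||_2 <= 1
y = torch.sign(torch.randn(m))               # labels in {-1, +1}
lam = torch.rand(d)                          # lambda_1, ..., lambda_d >= 0

def regularized_objective(w):
    logistic = torch.log1p(torch.exp(-y * (X @ w))).mean()
    return logistic + (lam * w ** 2).sum()   # assumed weighted l2 penalty

H = torch.autograd.functional.hessian(regularized_objective, torch.zeros(d))
eigs = torch.linalg.eigvalsh(H)
print("claimed mu =", (2 * lam.min()).item(), "| smallest Hessian eig =", eigs.min().item())
print("claimed L  =", (1 + 2 * lam.max()).item(), "| largest Hessian eig =", eigs.max().item())
```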
        1.6 (4 points) If a function is µ-strongly convex and L-smooth, after T iterations of gradient
        descent we have:
Using the above, what is the convergence rate of gradient descent on the regularized logistic regression problem with the weighted ℓ2 norm penalty? In other words, suppose I want to achieve
∥wT+1 − w∗∥2 ≤ ϵ; express the number of iterations T that I need to run GD for.
        Note: You do not need to prove the given convergence guarantee, just provide the rate.
        Problem 2: MLE for Linear Regression (10 points)
        In this question, you are going to derive an alternative justification for linear regression via the
        squared loss. In particular, we will show that linear regression via minimizing the squared loss is
        equivalent to maximum likelihood estimation (MLE) in the following statistical model.
        Assume that for given x, there exists a true linear function parameterized by w so that the label y
        is generated randomly as
y = w⊤x + ϵ,
where ϵ ∼ N(0, σ²) is normally distributed noise with mean 0 and variance σ² > 0. In other
words, the labels of your data are equal to some true linear function of x, plus Gaussian noise
around that line.
2.1 (3 points) Show that the above model implies that the conditional density of y given x is
p(y|x) = (1/√(2πσ²)) exp(−(y − w⊤x)²/(2σ²)).
        Hint: Use the density function of the normal distribution, or the fact that adding a constant to a
        Gaussian random variable shifts the mean by that constant.
2.2 (2 points) Show that the risk of the predictor f(x) = E[y|x] is σ².
2.3 (3 points) The likelihood for the given data {(x1, y1), . . . , (xm, ym)} is given by
L̂(w, σ) = p(y1, . . . , ym|x1, . . . , xm) = ∏_{i=1}^{m} p(yi|xi).
Compute the log conditional likelihood, that is, log L̂(w, σ).
        Hint: Use your expression for p(y | x) from part 2.1.
2.4 (2 points) Show that the maximizer of log L̂(w, σ) is the same as the minimizer of the empirical
risk with squared loss, R̂(w) = (1/m) Σ_{i=1}^{m} (w⊤xi − yi)².
        Hint: Take the derivative of your result from 2.3 and set it equal to zero.
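To see the equivalence in Problem 2 concretely, the sketch below (synthetic data and hypothetical dimensions, not the assignment's) fits w once by ordinary least squares and once by minimizing a scaled negative log conditional likelihood under the Gaussian noise model; the two solutions should agree up to optimization error.

```python
import torch

torch.manual_seed(2)
m, d, sigma = 200, 3, 0.5
X = torch.randn(m, d)
w_true = torch.randn(d)
y = X @ w_true + sigma * torch.randn(m)      # y_i = w^T x_i + eps_i, eps_i ~ N(0, sigma^2)

# Minimizer of the empirical squared loss (ordinary least squares).
w_ls = torch.linalg.lstsq(X, y.unsqueeze(1)).solution.squeeze(1)

# Maximizer of the log conditional likelihood over w, via gradient descent on the
# (scaled) negative log-likelihood; the scaling does not change the argmax.
w = torch.zeros(d, requires_grad=True)
opt = torch.optim.SGD([w], lr=0.1)
for _ in range(2000):
    opt.zero_grad()
    nll = 0.5 * ((y - X @ w) ** 2).mean() / sigma ** 2
    nll.backward()
    opt.step()

print("max |w_mle - w_ls| =", (w.detach() - w_ls).abs().max().item())
```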
        2 Programming Questions (20 points)
        Use the link here to access the Google Colaboratory (Colab) file for this homework. Be sure to
        make a copy by going to “File”, and “Save a copy in Drive”. As with the previous homeworks, this
        assignment uses the PennGrader system for students to receive immediate feedback. As noted on
        the notebook, please be sure to change the student ID from the default ‘99999999’ to your 8-digit
        PennID.
        Instructions for how to submit the programming component of HW 2 to Gradescope are included
        in the Colab notebook. You may find this PyTorch linear algebra reference and this general
        PyTorch reference to be helpful in perusing the documentation and finding useful functions for
        your implementation.
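As a small illustration of the kind of PyTorch tensor and linear-algebra operations that tend to be useful for this sort of implementation (names and shapes here are made up; the notebook's actual functions and required signatures may differ):

```python
import torch

# Illustrative only; not the notebook's API.
X = torch.randn(8, 4)                    # a small batch of feature vectors
w = torch.zeros(4)
b = torch.tensor(0.0)

scores = X @ w + b                       # batched inner products via matmul
probs = torch.sigmoid(scores)            # logistic link
preds = (probs >= 0.5).float() * 2 - 1   # threshold to {-1, +1} labels
gram = X.T @ X                           # Gram matrix, sometimes useful for closed-form steps
print(scores.shape, probs.shape, preds.shape, gram.shape)
```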


        請(qǐng)加QQ:99515681  郵箱:99515681@qq.com   WX:codinghelp
