EMS5730 Spring 2024 Homework #0
Release date: Jan 10, 2024
Due date: Jan 21, 2024 (Sunday), 23:59
(Note: The course add-drop period ends at 5:30 pm on Jan 22.)
No late homework will be accepted!
Every student MUST include the following statement, together with his/her signature,
in the submitted homework.
I declare that the assignment submitted on the Elearning system is
original except for source material explicitly acknowledged, and that the
same or related material has not been previously submitted for another
course. I also acknowledge that I am aware of University policy and
regulations on honesty in academic work, and of the disciplinary
guidelines and procedures applicable to breaches of such policy and
regulations, as contained in the website
Submission notice:
● Submit your homework via the elearning system
General homework policies:
A student may discuss the problems with others. However, the work a student turns in must
be created COMPLETELY by oneself ALONE. A student may not share ANY written work or
pictures, nor may one copy answers from any source other than one’s own brain.
Each student MUST LIST on the homework paper the name of every person he/she has
discussed or worked with. If the answer includes content from any other source, the
student MUST STATE THE SOURCE. Failure to do so is cheating and will result in
sanctions. Copying answers from someone else is cheating even if one lists their name(s) on
the homework.
If there is information you need to solve a problem but the information is not stated in the
problem, try to find the data somewhere. If you cannot find it, state what data you need,
make a reasonable estimate of its value and justify any assumptions you make. You will be
graded not only on whether your answer is correct, but also on whether you have done an
intelligent analysis.
Q0 [10 marks]: Secure Virtual Machines Setup on the Cloud
In this task, you are required to set up virtual machines (VMs) on a cloud computing
platform. While you are free to choose any cloud platform, Google Cloud is recommended.
References [1] and [2] provide the tutorial for Google Cloud and Amazon AWS, respectively.
The default network settings on each cloud platform are insecure. Your VMs can be
hacked by external users, and the resulting resource overuse may leave your credit
card with a bill of up to US$5,000. To protect your VMs from being hacked and to
prevent financial losses, you should set up secure network configurations for all of your VMs.
In this part, you need to set up a whitelist for your VMs. Choose one of the following
options for your whitelist:
1. Only the IP address of your current device can access your VMs via SSH; traffic
from all other sources is blocked.
2. Only users inside the CUHK network (IP range: 137.189.0.0/16) can access your VMs
via SSH; traffic from outside CUHK is blocked. You can connect to the CUHK VPN to
ensure you are inside the CUHK network; reference [3] provides the CUHK VPN setup
information from ITSC.
a. [10 marks] Secure Virtual Machine Setup
References [4] and [5] are the user guides for network security configuration on
AWS and Google Cloud, respectively. Go through the document for the cloud
platform you use, then follow these steps to configure your VM's network
(a minimal command-line sketch follows the list):
i. Locate or create the security group/firewall of your VM.
ii. Remove all inbound/ingress and outbound/egress rules, except for the
default rule(s) responsible for internal access within the cloud platform.
iii. Add a new inbound/ingress rule with the SSH port(s) of your VMs (default:
22) and the source specified, e.g., '137.189.0.0/16' for CUHK users only.
iv. (Optional) Permit more ports as needed (e.g., when completing Q1 below).
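For illustration, here is a minimal sketch of step iii on Google Cloud using the
gcloud CLI (the rule name 'allow-ssh-cuhk' is hypothetical; adapt the network/target
flags to your own project, and see [5] for the console-based equivalent):
$ gcloud compute firewall-rules create allow-ssh-cuhk \
--direction=INGRESS \
--allow=tcp:22 \
--source-ranges=137.189.0.0/16
# allow SSH (TCP port 22) only from the CUHK IP range; once the default
# allow rules are removed (step ii), all other ingress traffic is dropped
# by the VPC's implied deny rule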
Q1 [90 marks + 20 bonus marks]: Hadoop Cluster Setup
Hadoop is an open-source software framework for distributed storage and processing. In this
problem, you are required to set up a Hadoop cluster using the VMs you instantiated in Q0.
In order to set up a Hadoop cluster with multiple virtual machines (VMs), you can
first set up a single-node Hadoop cluster on each VM [6], and then modify the
configuration files on each node to form a multi-node cluster. References [7], [9],
[10], and [11] provide setup instructions for a Hadoop cluster. Some important
notes/tips on instantiating VMs are given at the end of this section.
a. [20 marks] Single-node Hadoop Setup
In this part, you need to set up a single-node Hadoop cluster in a pseudo-distributed
mode and run the Terasort example on your Hadoop cluster.
i. Set up a single-node Hadoop cluster (recommended Hadoop version: 2.9.x;
all versions are available in [16]). Attach a screenshot of http://localhost:50070
(or http://<VM public IP>:50070 if opened in the browser of your local machine) to
verify that your installation is successful.
ii. After installing a single-node Hadoop cluster, you need to run the Terasort
example [8] on it. You need to record all your key steps, including your
commands and output. The following commands may be useful:
$ ./bin/hadoop jar \
./share/hadoop/mapreduce/hadoop-mapreduce-examples-2.9.2.jar \
teragen 120000 terasort/input
# generate the data for sorting
$ ./bin/hadoop jar \
./share/hadoop/mapreduce/hadoop-mapreduce-examples-2.9.2.jar \
terasort terasort/input terasort/output
# sort the generated data
$ ./bin/hadoop jar \
./share/hadoop/mapreduce/hadoop-mapreduce-examples-2.9.2.jar \
teravalidate terasort/output terasort/check
# validate that the output is sorted
Notes: To monitor the Hadoop service via the Hadoop NameNode WebUI
(http://<VM public IP>:50070) in your local browser, you may, based on the steps in
Q0, further allow traffic from the CUHK network to port 50070 of your VMs (see the
sketch below).
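A minimal sketch of the key start-up commands for a pseudo-distributed Hadoop 2.9.x
install (assuming core-site.xml, hdfs-site.xml, and passphraseless SSH have already
been configured per [6]), plus a hypothetical gcloud rule for the WebUI port:
$ ./bin/hdfs namenode -format # format HDFS (first start only)
$ ./sbin/start-dfs.sh # start NameNode, DataNode, SecondaryNameNode
$ ./sbin/start-yarn.sh # start ResourceManager and NodeManager
$ jps # verify that the daemons above are listed
$ gcloud compute firewall-rules create allow-hadoop-webui \
--direction=INGRESS \
--allow=tcp:50070 \
--source-ranges=137.189.0.0/16
# expose the NameNode WebUI (port 50070) to the CUHK network only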
b. [40 marks] Multi-node Hadoop Cluster Setup
After the setup of a single-node Hadoop cluster in each VM, you can modify the
configuration files in each node to set up the multi-node Hadoop cluster.
i. Install and set up a multi-node Hadoop cluster with 4 VMs (1 Master and 3
Slaves). Use the ‘jps’ command to verify all the processes are running.
ii. In this part, you need to use the 'teragen' command to generate 2 different
datasets to serve as the input for the Terasort program. Use the following
two rules to determine the sizes of your own two datasets (a sizing sketch
follows the notes below):
■ Size of dataset 1: (Your student ID % 3 + 1) GB
■ Size of dataset 2: (Your student ID % 20 + 10) GB
Then run the Terasort code on these two datasets and compare their running
times.
Hints: Keep an image of your Hadoop cluster; you will need to use the cluster
again for subsequent homework assignments.
Notes:
1. You may need to add each VM to the whitelist of your security group/firewall
and further allow traffic to additional ports needed by Hadoop/YARN
services (references [17], [18]).
2. For step i, the resulting cluster should consist of 1 namenode and 4
datanodes. More precisely, 1 namenode and 1 datanode would be running on
the master machine, and each slave machine runs one datanode.
3. Please ensure that after the cluster setup, the number of “Live Nodes” shown
on Hadoop NameNode WebUI (port 50070) is 4.
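As a sizing sketch: teragen takes the number of 100-byte rows as its first argument,
so an S GB dataset needs S x 10^7 rows. For a hypothetical student ID 1155123456,
dataset 1 is 1155123456 % 3 + 1 = 1 GB and dataset 2 is 1155123456 % 20 + 10 = 26 GB:
# etc/hadoop/slaves on the master lists one datanode host per line; include
# the master itself plus slave1..slave3 so that 4 "Live Nodes" appear (note 3)
$ ./bin/hadoop jar \
./share/hadoop/mapreduce/hadoop-mapreduce-examples-2.9.2.jar \
teragen 10000000 terasort/input-1gb
# 1 GB / 100 bytes per row = 10,000,000 rows; use 260,000,000 for the 26 GB set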
c. [30 marks] Running Python Code on Hadoop
Hadoop streaming is a utility that comes with the Hadoop distribution. This utility
allows you to create and run MapReduce jobs with any executable or script as the
mapper and/or the reducer. In this part, you need to run the Python wordcount script
on the Shakespeare dataset [12] via Hadoop streaming.
i. Reference [13] introduces the method to run a Python wordcount script via
Hadoop streaming. You can also download the script from the reference [14].
ii. Run the Python wordcount script and record the running time. The following
command may be useful:
$ ./bin/hadoop jar \
./share/hadoop/tools/lib/hadoop-streaming-2.9.2.jar \
-file mapper.py -mapper mapper.py \
-file reducer.py -reducer reducer.py \
-input input/* \
-output output
# submit a Python program via Hadoop streaming
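Before submitting through streaming, you can sanity-check the scripts locally with a
shell pipeline; the sort step stands in for Hadoop's shuffle, which groups keys
before they reach the reducer (the input filename shakespeare.txt is hypothetical):
$ chmod +x mapper.py reducer.py # streaming runs the scripts as executables
$ head -n 100 shakespeare.txt | ./mapper.py | sort -k1,1 | ./reducer.py
# should print the same (word, count) pairs the MapReduce job would emit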
d. [Bonus 20 marks] Compiling the Java WordCount program for MapReduce
The Hadoop framework is written in Java. You can easily compile and submit a Java
MapReduce job. In this part, you need to compile and run your own Java wordcount
program to process the Shakespeare dataset [12].
i. In order to compile the Java MapReduce program, you may need to use the
"hadoop classpath" command to fetch the list of all Hadoop jars, or you can
simply copy all dependency jars into one directory and use them for compilation.
Reference [15] introduces the method to compile and run a Java wordcount
program in the Hadoop cluster. You can also download the Java wordcount
program from reference [14].
ii. Run the Java wordcount program and compare the running time with part c.
Part (d) is a bonus question for IERG 4300 but required for ESTR 4300.
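A minimal compile-and-run sketch (assuming the main class is named WordCount, as in
the program from [14]):
$ javac -classpath "$(./bin/hadoop classpath)" WordCount.java
# compile against all Hadoop jars reported by 'hadoop classpath'
$ jar cf wordcount.jar WordCount*.class
# package the class files (including the inner Mapper/Reducer classes)
$ ./bin/hadoop jar wordcount.jar WordCount input output
# submit the job; compare its running time with part (c)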
IMPORTANT NOTES:
1. Since AWS no longer provides free credits, we recommend using Google
Cloud (which offers a 90-day, $300 free trial) for this homework.
2. If you use PuTTY as your SSH client, download it from the official website
https://www.putty.org/ and avoid using the default private key. Failure to do so
will leave your AWS account/Hadoop cluster open to hijacking.
3. Launching instances with Ubuntu (version >= 18.04 LTS) is recommended. Hadoop
version 2.9.x is recommended. Older versions of Hadoop may have vulnerabilities
that can be exploited by hackers to launch DoS attacks.
4. (AWS) For each VM, the t2.large instance type with a 100 GB hard disk is
recommended; it provides 2 CPU cores and 8 GB RAM.
5. (Google) For each VM, the n2-standard-2 instance type with a 100 GB hard
disk is recommended; it provides 2 CPU cores and 8 GB RAM.
6. When following the given references, you may need to modify the commands
according to your own environment, e.g., file location, etc.
7. After installing a single-node Hadoop, you can save the system image and launch
multiple copies of the VM with that image. This can simplify your process of installing
the single-node Hadoop cluster on each VM.
8. Keep an image for your Hadoop cluster. You will need to use the Hadoop cluster
again for subsequent homework assignments.
9. Always refer to the logs when debugging your single-/multi-node Hadoop setup;
they contain more details than the CLI output.
10. Please shut down (do not terminate) your VMs when you are not using them. This
saves credits and avoids attacks while your VMs sit idle (see the sketch below).
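For example, on Google Cloud a VM can be stopped and restarted later with the gcloud
CLI (the instance name 'hadoop-master' and the zone are hypothetical):
$ gcloud compute instances stop hadoop-master --zone=us-central1-a
# billing for vCPU/RAM stops; the disk, and your Hadoop install, is preserved
$ gcloud compute instances start hadoop-master --zone=us-central1-a
# note: an ephemeral external IP may change after a restart unless reserved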
Submission Requirements:
1. Include all the key steps/commands, your cluster configuration details, the source
code of your programs, your compilation steps (if any), etc., together with
screenshots, in a SINGLE PDF report. Your report should also include the signed
declaration (the first page of this homework file).
2. Package all the source code (as included in step 1) into a separate zip file.
3. You should submit two files: your homework report (in PDF format) and a zip
file containing all the code for your homework.
4. Please submit your homework report and code zip file through the Blackboard
system. No email submission is allowed.