For marketers who love technology

Learning Hadoop with IBM's bigdatauniversity - how to solve download problems for VMware images under Linux

If you are following the http://bigdatauniversity.com courses, chances are you have wanted to try the "Hadoop fundamentals" labs. They require a large VMware image that supports Hadoop cluster deployment.

Problems: first, the most recent files only support single-node deployments, so you have to look for older versions on the IBM website. Second, IBM's servers are flaky and the virtual machine images are over 7 GB: it was impossible for me to download those files completely, either with the Java applet or as a plain browser download.

The solution for me was to copy the unique download URL after the download had failed, then continue the download with wget, using the -t option, which sets the number of times the download should be retried on error.

Concretely, I used the following command:

wget -t inf --continue https://iwm.dhe.ibm.com/(...)/iibi3002_QuickStart_Cluster_OVF.7z
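The two flags do the work here: -t inf (long form --tries) retries indefinitely, and --continue resumes a partial file instead of restarting from zero. For illustration only, the retry-until-success idea behind -t inf can be sketched as a small POSIX shell wrapper; the retry function and the simulated flaky command below are my own examples, not part of wget or IBM's tooling:

```shell
#!/bin/sh
# Sketch of "retry until it succeeds", mirroring what `wget -t inf` does.
# `retry CMD MAX` re-runs CMD until it exits 0, giving up after MAX attempts.
retry() {
  cmd=$1
  max=$2
  attempt=0
  until sh -c "$cmd"; do
    attempt=$((attempt + 1))
    if [ "$attempt" -ge "$max" ]; then
      echo "giving up after $max attempts" >&2
      return 1
    fi
  done
  return 0
}

# Simulated flaky download: fails on the first attempt, then succeeds once
# the marker file exists (a stand-in for a partially downloaded file).
rm -f /tmp/flaky_marker
if retry 'test -f /tmp/flaky_marker || { touch /tmp/flaky_marker; exit 1; }' 5; then
  echo "download complete"   # prints "download complete"
fi
```

For real downloads you should of course just use wget itself, since --continue also sends the HTTP Range header needed to resume a partial file, which a naive re-run would not.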

And it should work for you too; you just have to change the URL. If this post helped you, share it! The more links it gets, the more visible it will be in search engines.


About Gilles
