Monday, September 26, 2016

Ubuntu install lists


A list of post-installation steps for Ubuntu


sudo apt-get install freeglut3-dev build-essential libx11-dev libxmu-dev libxi-dev libgl1-mesa-glx libglu1-mesa libglu1-mesa-dev

sudo apt-get update
sudo apt-get upgrade

Google Chrome






http://cn.soulmachine.me/2016-08-17-deep-learning-cuda-development-environment/
Driver:

sudo add-apt-repository ppa:graphics-drivers/ppa
sudo apt-get update
sudo apt-get install nvidia-375   # GeForce (use nvidia-377 for Quadro)

1. Stop the X server:
'Ctrl + Alt + F1' ===> switch to a text terminal
sudo service lightdm stop
2. Download and install the driver:
sudo sh NVIDIA.......





CUDA  #install
sudo sh cuda_8.0.27_linux.run --tmpdir=/tmp --override
sudo sh cuda_8.0.27.1_linux.run

Add to ~/.bashrc:
export PATH=/usr/local/cuda-8.0/bin${PATH:+:${PATH}}
export LD_LIBRARY_PATH=/usr/local/cuda-8.0/lib64${LD_LIBRARY_PATH:+:${LD_LIBRARY_PATH}}
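The `${VAR:+:${VAR}}` idiom in these exports appends the old value, with a `:` separator, only when the variable is already set, so an unset variable doesn't leave a stray colon. A small sketch of the expansion (using a stand-in variable `OLD` rather than the real `PATH`):

```shell
# Case 1: the variable is set, so ":${OLD}" is appended.
OLD=/usr/bin
demo1="/usr/local/cuda-8.0/bin${OLD:+:${OLD}}"
echo "$demo1"    # /usr/local/cuda-8.0/bin:/usr/bin

# Case 2: the variable is unset, so the whole ${OLD:+...} expands to nothing.
unset OLD
demo2="/usr/local/cuda-8.0/bin${OLD:+:${OLD}}"
echo "$demo2"    # /usr/local/cuda-8.0/bin
```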

#test
cd ~/NVIDIA_CUDA-8.0_Samples/1_Utilities/deviceQuery
make
./deviceQuery

cd ~/NVIDIA_CUDA-8.0_Samples/5_Simulations/nbody/
make
./nbody -benchmark -numbodies=256000 -device=0

Output:
> Windowed mode
> Simulation data stored in video memory
> Single precision floating point simulation
> 1 Devices used for simulation
gpuDeviceInit() CUDA Device [0]: "GeForce GTX 1080"
> Compute 6.1 CUDA device: [GeForce GTX 1080]
number of bodies = 256000
256000 bodies, total time for 10 iterations: 2364.286 ms
= 277.192 billion interactions per second
= 5543.830 single-precision GFLOP/s at 20 flops per interaction
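A quick sanity check on the reported numbers: the simulation does bodies² interactions per iteration, and each interaction counts as 20 flops, so GFLOP/s is just interactions per second times 20. A sketch of the arithmetic with the figures above:

```shell
# 256000 bodies, 10 iterations, 2364.286 ms total:
# interactions/s = bodies^2 * iterations / seconds
ips_billion=$(awk 'BEGIN { printf "%.2f", 256000^2 * 10 / 2.364286 / 1e9 }')
gflops=$(awk 'BEGIN { printf "%.1f", 256000^2 * 10 / 2.364286 * 20 / 1e9 }')
echo "$ips_billion billion interactions per second"
echo "$gflops single-precision GFLOP/s at 20 flops per interaction"
```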
cuDNN
'copy ../cuda/include/*.h  to  /usr/local/cuda-8.0/include/'
'copy ../cuda/lib64/*.* to /usr/local/cuda-8.0/lib64'
**'/usr/local/cuda-8.0' is the path where CUDA is installed.
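Spelled out as commands, that copy step looks like the sketch below. The source directory `cuda/` is an assumption (wherever the cuDNN tarball was extracted), and the real copies into `/usr/local/cuda-8.0` need `sudo`; the sketch rehearses the same commands against throwaway directories:

```shell
# Stand-ins for the extracted cuDNN tree and the CUDA install prefix.
src=$(mktemp -d)
dst=$(mktemp -d)
mkdir -p "$src/include" "$src/lib64" "$dst/include" "$dst/lib64"
touch "$src/include/cudnn.h" "$src/lib64/libcudnn.so.5"
ln -s libcudnn.so.5 "$src/lib64/libcudnn.so"

# The copy step itself: headers, then libraries (-P keeps symlinks as symlinks).
cp "$src/include/"*.h "$dst/include/"
cp -P "$src/lib64/"* "$dst/lib64/"
ls "$dst/include"    # cudnn.h
```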


Tensorflow
sudo apt-get install libcurl3-dev
$ conda install virtualenv   # suggested by the terminal output; do not use pip install here
$ conda create --name=tensorflow_env python=3.5
$ source activate tensorflow_env
$ pip install --upgrade https://storage.googleapis.com/tensorflow/linux/cpu/tensorflow-0.5.0-cp27-none-linux_x86_64.whl # note: this wheel is cp27 (Python 2.7); use the matching Python 3 wheel for the 3.5 env





(to delete the env: conda env remove --name tensorflow_env)

Rebuild tensorflow with CUDA!!!
https://alliseesolutions.wordpress.com/2016/09/08/install-gpu-tensorflow-from-sources-w-ubuntu-16-04-and-cuda-8-0-rc/

Note: 
1. Install protobuf 3.0!! MUST BE 3.0:
pip install protobuf==3.0.0
2. TensorFlow MUST be downloaded from the website, NOT fetched from the command line!!!
In build:
$ bazel build -c opt --config=cuda //tensorflow/tools/pip_package:build_pip_package
$ bazel-bin/tensorflow/tools/pip_package/build_pip_package ~/Downloads/tensorflow_pkg
$ sudo pip3 install ~/Downloads/tensorflow_pkg/tensorflow-0.10.0-py3-none-any.whl









######==========================================================================
Caffe:
http://caffe.berkeleyvision.org/installation.html#prerequisites

When compiling OpenCV, BEFORE 'make':
In file 'modules/cudalegacy/src/graphcuts.cpp' change:
-#if !defined (HAVE_CUDA) || defined (CUDA_DISABLER)
+// GraphCut has been removed in NPP 8.0
+#if !defined (HAVE_CUDA) || defined (CUDA_DISABLER) || (CUDART_VERSION >= 8000)

Prerequisites:
1. protobuf
sudo apt-get install autoconf automake libtool curl make g++ unzip
2. glog, etc..
sudo apt-get install libgflags-dev libgoogle-glog-dev liblmdb-dev
3. hdf5
sudo apt-get install hdf5-tools
cd /usr/lib/x86_64-linux-gnu
sudo ln -s libhdf5_serial.so.10.1.0 libhdf5.so
sudo ln -s libhdf5_serial_hl.so.10.0.2 libhdf5_hl.so
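The symlinks are needed because the linker resolves `-lhdf5` to a plain `libhdf5.so` name, which the serial HDF5 packages on Ubuntu don't provide. The pattern can be rehearsed in a scratch directory (the real commands above need `sudo` and operate in /usr/lib/x86_64-linux-gnu):

```shell
d=$(mktemp -d)
touch "$d/libhdf5_serial.so.10.1.0"              # stand-in for the packaged library
ln -s libhdf5_serial.so.10.1.0 "$d/libhdf5.so"   # the unversioned name -lhdf5 looks for
readlink "$d/libhdf5.so"    # libhdf5_serial.so.10.1.0
```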

Changes in Makefile.config (uncomment/set):
  OPENCV_VERSION := 3
  USE_CUDNN := 1
  ANACONDA_HOME := $(HOME)/anaconda3/envs/caffe/
  PYTHON_INCLUDE := $(ANACONDA_HOME)/include \
                    $(ANACONDA_HOME)/include/python3.5 \
                    $(ANACONDA_HOME)/lib/python3.5/site-packages/numpy/core/include
  USE_PKG_CONFIG := 1

(BEFORE 'make all' for caffe)
find . -type f -exec sed -i -e 's^"hdf5.h"^"hdf5/serial/hdf5.h"^g' -e 's^"hdf5_hl.h"^"hdf5/serial/hdf5_hl.h"^g' '{}' \;
mkdir build
cd build
cmake ..
make all      (Or cmake -DWITH_IPP=ON . && make -j $(nproc) && make install???)
make install
make runtest
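The `find ... sed` one-liner above rewrites every `"hdf5.h"` / `"hdf5_hl.h"` include to the `hdf5/serial/` paths (the `^` characters are just an alternative delimiter for `s///`). It can be rehearsed on a scratch file first:

```shell
# Scratch file with the two includes Caffe's sources use.
f=$(mktemp)
printf '#include "hdf5.h"\n#include "hdf5_hl.h"\n' > "$f"

# Same substitutions as the one-liner, applied to one file.
sed -i -e 's^"hdf5.h"^"hdf5/serial/hdf5.h"^g' \
       -e 's^"hdf5_hl.h"^"hdf5/serial/hdf5_hl.h"^g' "$f"
cat "$f"
```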



C++
CodeBlocks:
sudo add-apt-repository ppa:damien-moore/codeblocks-stable
sudo apt-get update
sudo apt-get install codeblocks
Anaconda

Pycharm:
sudo add-apt-repository ppa:mystic-mirage/pycharm
sudo apt-get update
sudo apt-get install pycharm-community
Notepad++:
sudo add-apt-repository ppa:notepadqq-team/notepadqq
sudo apt-get update
sudo apt-get install notepadqq


Wednesday, March 16, 2016

Install PySpark on Windows and with PyCharm!!!

Build a Spark environment for the Windows command line.
Paths added in this section go into the Windows environment variables by default.

1. Install java JDK to ..\Java\jdk1.8.0_74

Add path ..\Java\jdk1.8.0_74;..\Java\jdk1.8.0_74\bin


2. Install scala to ..\scala

Add path ..\scala\bin


3. Do NOT install standalone Python 3!

   Instead, install Anaconda3 and add its path at the very front of PATH!!



4. Download the latest prebuilt Spark to ..\Spark\spark-1.6.1

Add path:  ..\Spark\spark-1.6.1;..\Spark\spark-1.6.1\bin


5. SPARK_HOME = ..\Spark\spark-1.6.1

    Download hadoop "winutils.exe" etc. files.

    HADOOP_HOME  =  ../hadoop/hadoop-common-2.2.0-bin-master

Now java, scala, and spark can all be run from the command line in Windows.

======================================================================
Next, go further for the PyCharm IDE.
Paths added in this section go into the "current-project" settings under Run\Edit Configurations.

6. Install PyCharm


7. Add Environment Variables in PyCharm IDE
PYTHONPATH  =  ../Spark/spark-1.6.1/python;../Spark/spark-1.6.1/python/lib/py4j-0.9-src.zip;..\hadoop\hadoop-common-2.2.0-bin-master\bin

HADOOP_HOME  =  ../hadoop/hadoop-common-2.2.0-bin-master

SPARK_HOME = ../Spark/spark-1.6.1


=======================================================================
To be further tested and simplified!