Update: 6-16-2016
>>Upgrade Win7 to Win10 (click)<< (5/?) Click "Upgrade Now" - "Save" (any location, or the Desktop) - then open the downloaded file at the bottom-left of the browser - Yes - (checking for updates) - the update then downloads automatically
When mongod cannot start: run ps aux | grep mongod, then kill the PID of the mongod process
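The "find the PID, then kill it" note above can be rehearsed on a sample line of ps output (the PID is column 2). The sample line and --dbpath below are made up for illustration; on a live system you would pipe real ps output instead.

```shell
# Sketch: extract the PID (column 2) from a ps-style line with awk.
# The sample line is fabricated for this demo.
sample='user  12345  0.3  1.2  500  600 ??  S  9:00AM  0:01 mongod --dbpath /data/db'
# grep '[m]ongod' matches "mongod" but would not match the grep process itself
pid=$(printf '%s\n' "$sample" | grep '[m]ongod' | awk '{print $2}')
echo "$pid"
# Live version: ps aux | grep '[m]ongod' | awk '{print $2}' | xargs kill
```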
Click to download: new_ubuntu_OS_setup_file (GNOME, 16.04)
Ubuntu 16.04: How to Move Unity’s Left Launcher to The Bottom
gsettings set com.canonical.Unity.Launcher launcher-position Bottom
revert back:
gsettings set com.canonical.Unity.Launcher launcher-position Left
Mac Software (Not in the App Store)
Download
>>Scroll Reverser (set different scroll directions for mouse and trackpad)<< |
>>Go2Shell (Website)<< |
>>MacTex<< |
>>Python<< |
>>Python3.8.8<< |
>>Python2.7.18<< |
>>Anaconda<< |
>>Nodejs<< |
>>KugouMusic<< |
>>Github<< |
>>Sublime Text 2<< |
>>Sublime Text 3<< |
>>Atom<< |
>>Visual Studio Code<< |
>>MongoHub<< |
>>Android File Transfer<< |
>>Logitech Options<< |
>>Firefox<< |
>>Chrome<< |
>>VLC<< |
>>TeamViewer<< |
>>Paintbrush2<< |
>>MySQL Workbench<< |
>>Java<< |
>>JDK8<< |
>>PyCharm<< |
>>JetBrains Student Reg<< |
>>VirtualBox<< |
>>Dropbox<< |
>>name<< |
Other Linux Software
Dropbox |
Youdao Dictionary (有道词典) |
Sublime Text 3(S/N) |
Chrome |
WPS |
Slack |
JetBrains Products
list pods
kubectl get pods
kubectl get pods -n [namespace]
kubectl get pods | grep [partial-pod-name]
create a pod
kubectl create -f /[dir]/[filename].yml
connect to pod
kubectl exec -t -i [pod name] -- bash
delete a pod
kubectl delete pod [pod name]
S3 operations
aws s3 <ls|cp|rm|mv> [args] (append | more to page long output)
Search keywords in local files
grep -Hr "Message" ./
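A quick self-contained check of the recursive search above, using a throwaway directory (the path /tmp/grep_demo and file contents are arbitrary):

```shell
# Demo of grep -H (print filename) -r (recurse) on a scratch directory.
mkdir -p /tmp/grep_demo/sub
echo 'error: Message lost' > /tmp/grep_demo/a.log
echo 'all good' > /tmp/grep_demo/sub/b.log
grep -Hr "Message" /tmp/grep_demo
# prints: /tmp/grep_demo/a.log:error: Message lost
```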
"Brief Instructions"
Common Hadoop HDFS file operation commands
"ls: check files or details"
hadoop fs -ls /
hadoop fs -ls -R /
"put: local to hdfs"
hadoop fs -put < local file > < hdfs file > (hdfs file cannot exist)
"get: hdfs to local"
hadoop fs -get < hdfs file > < local file or dir >
"cat: look at the file"
hadoop fs -cat < hdfs file >
"mkdir: create a new dir"
hadoop fs -mkdir < hdfs dir >
"rm: remove files/folders from hdfs"
hadoop fs -rm < hdfs file > ...
hadoop fs -rm -r < hdfs dir>...
"du: size of files"
hadoop fs -du < hdfs path >
hadoop fs -du -s < hdfs path >
hadoop fs -du -h < hdfs path >
"text: text output of files"
hadoop fs -text < hdfs file >
"Run on pyspark 3.0 with graphframes 0.8.0"
$SPARK_HOME/bin/pyspark --packages graphframes:graphframes:0.8.0-spark3.0-s_2.12
To install a package: /Users/USERNAME/anaconda3/bin/pip install ir_evaluation_py
"install pyspark"
PySpark on macOS: installation and use
"vi ~/.bashrc"
export PATH=/Users/USERNAME/anaconda3/bin:$PATH
export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_162.jdk/Contents/Home/
export JRE_HOME=/Library/java/JavaVirtualMachines/openjdk-14.0.1.jdk/Contents/Home/
export SPARK_HOME=/usr/local/Cellar/apache-spark/3.0.0/libexec
export PATH=/usr/local/Cellar/apache-spark/3.0.0/bin:$PATH
export PYSPARK_PYTHON=/usr/local/bin/python3
export PYSPARK_DRIVER_PYTHON=jupyter
export PYSPARK_DRIVER_PYTHON_OPTS='notebook'
"install Anaconda3"
Official Download
Install on Mac
"source Anaconda3"
vi ~/.bashrc
export PATH=/Users/USERNAME/anaconda3/bin:$PATH
"show disk usage"
du -h --max-depth=0 * | sort -hr (macOS: du -hd 0 * | sort -hr, or du -sh *)
df
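The du line above uses the GNU --max-depth flag; du -sh is a portable equivalent. A scratch-directory sketch (paths and sizes are made up):

```shell
# Per-directory sizes, largest first: `du -sh *` summarizes each entry,
# `sort -hr` orders the human-readable sizes descending.
mkdir -p /tmp/du_demo/big /tmp/du_demo/small
head -c 200000 /dev/zero > /tmp/du_demo/big/blob
head -c 100 /dev/zero > /tmp/du_demo/small/tiny
du -sh /tmp/du_demo/* | sort -hr
```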
"save output(yarn) to text"
python xxxx.py &> output.txt
yarn logs --applicationId xxx &> logs.txt
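The &> redirection used above is bash syntax and captures both stdout and stderr into one file; a minimal sketch with the portable POSIX spelling (the file path is arbitrary):

```shell
# `cmd > file 2>&1` sends stdout AND stderr to file.
# In bash the same thing can be written more briefly as: cmd &> file
{ echo 'normal output'; echo 'error output' 1>&2; } > /tmp/output.txt 2>&1
cat /tmp/output.txt
```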
"initialize MongoDB"
sudo mkdir -p /data/db
sudo chmod g+w /data/db
sudo mongod &
mongo
"Export":
exit the mongo shell first, then from the system shell:
mongoexport --db test --collection traffic --out sample.json
"Import":
exit the mongo shell first, then from the system shell:
mongoimport --db test --collection traffic --file sample.json
"find all / find one":
db.collection_name.find() / db.collection_name.findOne()
"find sth":
db.collection_name.find({field_name: "value"})
"drop":
db.dropDatabase() / db.collection_name.drop()
"count":
db.collection_name.find().count()
"distinct":
db.collection_name.distinct("field_name")
"delete":
db.collection_name.remove({field_name: "value"})
"update & push":
git pull
git add * / git rm filename.type
git commit -m "sth"
git push
Delete a folder locally and then push the removal to GitHub, e.g.:
rm -rf folder
git add .
git commit -a -m "removed folder"
git push origin master
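The removal flow above, rehearsed in a throwaway local repository (the path /tmp/git_demo and identity are arbitrary; there is no remote here, so the final git push is left as a comment):

```shell
rm -rf /tmp/git_demo
git init -q /tmp/git_demo
cd /tmp/git_demo
git config user.email demo@example.com   # local identity so commits work
git config user.name demo
mkdir folder && echo hi > folder/file.txt
git add . && git commit -q -m "add folder"
rm -rf folder                # delete the folder locally
git add .                    # `git add .` also stages deletions
git commit -q -m "removed folder"
git rev-list --count HEAD    # 2 commits
# git push origin master     # would publish the removal to GitHub
```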
Click to download the SIGGRAPH demo files
Copyright © Jinda Han. 2015-2021. All Rights Reserved.