Partially covered:
Django
Python
Docker
GIT
Linux
1.1. Create a VS Code editor from the web (press "." to open when on the main repository page)
2.1.1.1. Create a git patch from the uncommitted changes in the current working directory
2.1.1.9. Diff of what has changed between staged changes and the last commit
2.1.2.1. List all branches. The current one is marked with *
2.1.3.1. Fetch and merge all commits from the tracked remote branch
2.1.3.2. Fetch and merge all commits from a specific remote branch
2.1.3.3. Fetch recent changes from the tracked remote branch but don't merge them
2.1.3.4. Push all local branch commits to the tracked remote branch
2.1.3.5. Push all local branch commits to a specific remote branch
2.1.3.7. Display a list of remote repositories and their URLs
2.1.4.2. Show all commits in the current branch’s history by printing each commit on a single line
2.1.4.3. Show number of commits per author on all branches, excluding merge commits
2.1.4.4. Show number of commits per author on a branch, excluding merge commits
2.1.4.5. Show number of commits per author on all branches, including merge commits
2.1.4.6. Show number of commits per author on a branch, including merge commits
2.1.5.1. Reapply commits from the current branch on top of another base
2.1.6.1. Revert the changes in a commit and record them in a new commit
2.1.6.2. Reset to a previous commit and preserve the changes made since [commit] as unstaged
2.1.6.3. Reset to a previous commit and discard the changes made since the [commit]
2.1.7.2. Stash modified and staged changes with a custom message
2.1.7.5. Restore the most recently stashed changeset and delete it
2.1.9.3. Clone a repository and add it to the current folder
Include in the current project files from previous pushes that became larger than 100 MB
4.6.1.2. Make a view accept single or multiple posts to database
9.2.7. To stop all Docker containers, simply run the following command in your terminal
10.1.1. Update a table column, replacing null values with a value
Text type: str
Numeric types: int, float, complex
Sequence types: list, tuple, range
Mapping type: dict
Set types: set, frozenset
Boolean type: bool
Binary types: bytes, bytearray, memoryview
Below is an example of how to create a VS Code editor from the web: press "." to open when on the main repository page. If you connect your account and synchronize, it will load your settings.
If you haven't yet committed the changes, then:
git diff > mypatch.patch
But sometimes part of the work involves new files that are untracked and won't appear in the git diff output. So one way to create a patch is to stage everything for a new commit (git add each file, or just git add .) but not commit it, and then:
git diff --cached > mypatch.patch
Add the --binary option if you want to add binary files to the patch (e.g. mp3 files):
git diff --cached --binary > mypatch.patch
You can later apply the patch:
git apply mypatch.patch
git status
git add [file_name]
git add [folder_name]
git commit -m "descriptive_message"
git commit -am "descriptive_message"
git reset [file_name]
git diff
git diff --staged
git branch
git branch [branch_name]
git checkout [branch_name]
git checkout -b [branch_name]
git checkout -
git checkout -m [new_branch]
git branch -d [branch_name]
git merge [branch_name]
git pull
git pull [alias] [branch_name]
git fetch
git push
git push [alias] [branch_name]
git remote add [alias] [repo_url]
git remote -v
git log
git log --oneline
git shortlog -s -n --all --no-merges
git shortlog -s -n [branch_name] --no-merges
git shortlog -s -n --all
git shortlog -s -n [branch_name]
git rebase [branch_name]
git rebase --abort
git rebase --continue
git revert [commit]
git reset [commit]
git reset --hard [commit]
git stash
git stash push -m "message"
git stash push src/custom.css
git stash list
git stash pop
git stash drop
git tag "tagname"
git tag
git tag -d "tagname"
git init
git init [folder_name]
git clone [repo_url]
git clone [repo_url] [folder_name]
git config --global user.name "user_name"
git config --global user.email "user_email"
git config --global color.ui auto
GIT=$(git rev-parse --show-toplevel)
cd $GIT/..
rm -rf $GIT
git clone ...
git clean --force -d -x
git reset --hard
git branch -m master develop
git fetch origin
git branch -u origin/develop develop
git remote set-head origin -a
Documentation: https://git-lfs.github.com/
This will include all CSV files:
git lfs migrate import --include="*.csv"
Link to the cheatsheet:
Text type: str
Numeric types: int, float, complex
Sequence types: list, tuple, range
from functools import reduce

# Flatten a nested list by concatenating the sub-lists pairwise
nested_list = [[1], [2], [3, 5, 6], [4, 3, 12, 33]]
simple_list = reduce(lambda x, y: x + y, nested_list)
Mapping type: dict
Set types: set, frozenset
Boolean type: bool
Binary types: bytes, bytearray, memoryview
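The reduce-based flattening shown above can also be written with itertools.chain.from_iterable, which avoids re-copying the accumulator on every pairwise concatenation; a small sketch:

```python
from itertools import chain

nested_list = [[1], [2], [3, 5, 6], [4, 3, 12, 33]]

# chain.from_iterable walks each sub-list in turn, so the result is
# built in a single pass instead of repeated list + list copies.
simple_list = list(chain.from_iterable(nested_list))
print(simple_list)  # [1, 2, 3, 5, 6, 4, 3, 12, 33]
```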
Generate EAN-13 barcodes for products.
Formula used: (5940000 + random_number + last-digit validator)
EAN country codes: https://wholesgame.com/trade-info/ean-barcodes-country/
python-barcode==0.13.1
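The last-digit validator from the formula above can be computed without the library. Below is a sketch of the standard EAN-13 check-digit algorithm; the "594" prefix and the random payload are illustrative assumptions, not part of the python-barcode API:

```python
import random

def ean13_check_digit(digits12: str) -> int:
    """Standard EAN-13 check digit for a 12-digit payload: digits in
    odd positions (1-based) count once, even positions count three
    times; the check digit rounds the sum up to a multiple of 10."""
    total = sum(int(d) * (3 if i % 2 else 1) for i, d in enumerate(digits12))
    return (10 - total % 10) % 10

def make_ean13(prefix: str = "594") -> str:
    # "594" is the Romanian country prefix; the random tail filling
    # the payload to 12 digits is an assumption for this sketch.
    payload = prefix + "".join(str(random.randint(0, 9)) for _ in range(12 - len(prefix)))
    return payload + str(ean13_check_digit(payload))

print(make_ean13())
```

python-barcode performs the same validation internally when it renders the barcode image.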
XlsxWriter: only for generating NEW files.
pip install XlsxWriter
This is testws.py
#!/usr/bin/env python3
# testws.py
import sys, json
import asyncio
from websockets import connect
class EchoWebsocket:
    async def __aenter__(self):
        self._conn = connect('wss://ws.binaryws.com/websockets/v3')
        self.websocket = await self._conn.__aenter__()
        return self

    async def __aexit__(self, *args, **kwargs):
        await self._conn.__aexit__(*args, **kwargs)

    async def send(self, message):
        await self.websocket.send(message)

    async def receive(self):
        return await self.websocket.recv()

class mtest:
    def __init__(self):
        self.wws = EchoWebsocket()
        self.loop = asyncio.get_event_loop()

    def get_ticks(self):
        return self.loop.run_until_complete(self.__async__get_ticks())

    async def __async__get_ticks(self):
        async with self.wws as echo:
            await echo.send(json.dumps({'ticks_history': 'R_50', 'end': 'latest', 'count': 1}))
            return await echo.receive()
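On newer Python versions, calling asyncio.get_event_loop() outside a running loop is deprecated; the same sync-wrapper pattern works with asyncio.run. The sketch below uses a stand-in echo class instead of the real websockets connection so it runs offline (FakeEcho is an assumption for illustration, not part of the websockets library):

```python
import asyncio
import json

class FakeEcho:
    """Stand-in for the websocket connection so the pattern runs offline."""
    async def __aenter__(self):
        return self

    async def __aexit__(self, *args):
        return None

    async def send(self, message):
        self._last = message

    async def receive(self):
        return self._last  # echo back what was sent

class mtest:
    def __init__(self):
        self.wws = FakeEcho()

    def get_ticks(self):
        # asyncio.run creates and closes a fresh event loop per call
        return asyncio.run(self.__async_get_ticks())

    async def __async_get_ticks(self):
        async with self.wws as echo:
            await echo.send(json.dumps({'ticks_history': 'R_50', 'end': 'latest', 'count': 1}))
            return await echo.receive()

print(mtest().get_ticks())  # prints the JSON string that was sent
```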
And this in main.py:
# main.py
from testws import *
a = mtest()
foo = a.get_ticks()
print (foo)
print ("async works like a charm!")
foo = a.get_ticks()
print (foo)
This is the output:
root@ubupc1:/home/dinocob# python3 test.py
{"count": 1, "end": "latest", "ticks_history": "R_50"}
async works like a charm!
{"count": 1, "end": "latest", "ticks_history": "R_50"}
This also includes diagram generation for database models using Graphviz.
https://github.com/django-extensions/django-extensions
## Default Authentication
REST_FRAMEWORK = {
    'DEFAULT_PERMISSION_CLASSES': [
        'rest_framework.permissions.IsAuthenticated',
    ],
    'DEFAULT_AUTHENTICATION_CLASSES': [  # new
        'rest_framework.authentication.SessionAuthentication',
        'rest_framework.authentication.BasicAuthentication',
        'rest_framework.authentication.TokenAuthentication',  # new
    ],
}
pipenv install dj-rest-auth==1.1.0
# config/settings.py
INSTALLED_APPS = [
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    # 3rd-party apps
    'rest_framework',
    'rest_framework.authtoken',  # new
    'dj_rest_auth',  # new
]

# config/urls.py
urlpatterns += [
    path('api-auth/', include('rest_framework.urls')),
    path('api/v1/dj-rest-auth/', include('dj_rest_auth.urls')),  # new
]
pipenv install django-allauth
# config/settings.py
INSTALLED_APPS += [
    'django.contrib.sites',  # new
    # 3rd-party apps
    'rest_framework.authtoken',
    'allauth',  # new
    'allauth.account',  # new
    'allauth.socialaccount',  # new
    'dj_rest_auth.registration',  # new
]
EMAIL_BACKEND = 'django.core.mail.backends.console.EmailBackend' # new
SITE_ID = 1 # new
urlpatterns += [
    path('api/v1/dj-rest-auth/registration/', include('dj_rest_auth.registration.urls')),  # new
]
from django.http import HttpResponseRedirect
from rest_framework.views import APIView

class ARedirectApiView(APIView):
    """An example APIView that takes an optional parameter and redirects to the path provided."""
    def get(self, request, *args, **kwargs):
        """Override the get method of the APIView."""
        # .get() returns None when the optional query parameter is absent
        ean_code = self.request.query_params.get("ean_code")
        if ean_code:
            image_location_in_media = "test.jpeg"
        else:
            image_location_in_media = "default_image.jpeg"
        host = request.get_host()  # host is the domain name
        redirect_path = f"http://{host}/media/{image_location_in_media}"
        return HttpResponseRedirect(redirect_to=redirect_path)
This is an example of a view that accepts a single (dict) or multiple (list of dicts) posts to the database.
class ProductViewset(viewsets.ModelViewSet):
    """Product viewset.
    API endpoint that allows products to be edited.
    Allowed actions: "POST", "PUT"
    """
    queryset = models.Product.objects.all()
    serializer_class = serializers.ProductSerializer
    http_method_names = ['post', 'put']

    # Allow a single product or multiple products to be posted to the database
    def create(self, request, *args, **kwargs):
        serializer = self.get_serializer(data=request.data, many=isinstance(request.data, list))
        serializer.is_valid(raise_exception=True)
        self.perform_create(serializer)
        headers = self.get_success_headers(serializer.data)
        return Response(serializer.data, status=status.HTTP_201_CREATED, headers=headers)
Installation
pip install coverage
Running and erasing coverage. For running on a Django project, use this command:
coverage run --source='.' manage.py test the-app-you-want-to-test
This command will fill a ".coverage" file, located at COVERAGE_FILE, and then you may view results or a report. If you need to remove gathered data, execute:
coverage erase
For a single file. Maybe you only want to check a single Python file; then do:
coverage run your_program.py arg1 arg2 arg3
There are some additional options, take a look on https://coverage.readthedocs.io/en/coverage-4.3.4/cmd.html#execution
See results. If you want to show the results in the command line, run:
coverage report
For more readable results run:
coverage html
To know concretely what part of your code is covered by tests, use:
coverage annotate -d directory-where-to-put-annotated-files
It will generate the same source code files with additional annotation syntax in them:
Good coverage level. Good coverage usually means around 90%. However, 100% can be a bad signal, because it may mean someone is optimizing for coverage instead of the quality of the tests.
Some tips:
Here we will create a new Django project inside a Docker image. This can be extended for use with an existing project. Details will be added further on.
Create a requirements.txt file with the following content. (This is the minimum needed to run the project: we need Django and psycopg2.)
Django>=3.0,<4.0
psycopg2-binary>=2.8
Create a Dockerfile with the following content:
# syntax=docker/dockerfile:1
FROM python:3
ENV PYTHONUNBUFFERED=1
WORKDIR /code
COPY requirements.txt /code/
RUN pip install -r requirements.txt
COPY . /code/
Create a docker-compose.yml file:
Here you define two services, db and web:
version: "3.9"
services:
  db:
    image: postgres
    volumes:
      - ./data/db:/var/lib/postgresql/data
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - "8000:8000"
    depends_on:
      - db
From the root of the directory:
Create the Django project by running the docker-compose run command as follows.
sudo docker-compose run web django-admin startproject composeexample .
After the docker-compose command completes, list the contents of your project.
ls -l
ls
ls -l
ls -la
cd
cd ..
pwd
mkdir
rm -r
rm -f
cp
mv
touch file
cat [file]
cat > [file]
cat >> [file]
tail -f [file]
ping [host]
route -n
iptables -L
netstat -a
whois [domain]
dig [domain]
dig -x [host]
wget [file]
wget -r [url]
curl [url]
ssh user@host
ssh -p [port] user@host
ssh -D user@host
ps
ps -aux
kill [pid]
killall proc
date
uptime
whoami
w
Display cpu info
cat /proc/cpuinfo
cat /proc/meminfo
free
du
du -sh
df
uname -a
tar -cf [file.tar] [files]
tar -xf [file.tar]
tar -tf [file.tar]
Show contents of archive
chmod [rights] [file]
Rights values:
4 - read(r)
2 - write(w)
1 - execute(x)
order: owner / group / world
chmod 777
- rwx for everyone
chmod 755
- rwx for owner, rx for group and world
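Each octal digit is just the sum of read (4), write (2), and execute (1) for owner, group, and world in that order. As a quick sanity check, Python's stat module can render an octal mode the way ls -l does; a small sketch:

```python
import stat

# 7 = 4 + 2 + 1 -> rwx; 5 = 4 + 1 -> r-x
assert 4 + 2 + 1 == 0o7
mode_755 = 0o755  # rwxr-xr-x

# stat.filemode renders a full mode word (file type + permission bits)
# like `ls -l`; stat.S_IFREG marks a regular file.
print(stat.filemode(stat.S_IFREG | mode_755))  # -rwxr-xr-x
```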
grep '[pattern]' [files]
grep -r '[pattern]' dir
locate [file]
whereis [app]
Found this image, which is a good starting point for data science projects. You need Docker already installed on the machine.
Installation link and instructions can be found here:
https://docs.docker.com/get-docker/
https://hub.docker.com/r/civisanalytics/datascience-python
docker pull civisanalytics/datascience-python
docker run -p 8888:8888 -i -t civisanalytics/datascience-python:latest /bin/bash
In the container, start the Jupyter notebook server:
jupyter notebook --ip 0.0.0.0 --no-browser --allow-root
Access the notebook server at http://localhost:8888/. The exact link, including the token, is shown in the terminal.
To add a folder to the notebook server you can use the following command:
docker run -p 8888:8888 -i -t -v /Users/elvismunteanu/elvism/DataScience:/elvism civisanalytics/datascience-python:latest /bin/bash
The option added in the second step is -v <your folder location>:<folder name and location inside the Docker container>
docker container create [image]
docker create [image] [command]
Use the same options and arguments as docker run
The image is downloaded if it is not present and the filesystem is created. Containers can be started later with docker start
docker start [container]
docker container start [container]
Use -a to attach to the container STDOUT and STDERR
Use -i to attach to the container STDIN (standard input)
docker inspect [container]
Prints out the container's configuration
--name [name]
Use --name to name the container. If not specified, Docker generates a random name of the form adjective_surname (e.g. nifty_turing).
docker run -it python:3 bash
Use the -it (or --interactive --tty) option to create an interactive container. It can be stopped by typing exit.
-p 8080:80
-p <host port>:<container port>
Use -p or --publish to declare a port mapping
Use -p multiple times for several port mappings
Use -P to publish all container ports declared in the image metadata to random host ports from the upper range
Command examples:
$ docker run -it -p 8080:80 -p 8443:443 nginx
$ docker run -it -p 8000:80 -p 8443:443 nginx
$ docker run -it -p 8888:8888 jupyter/datascience-notebook
docker kill $(docker ps -q)
docker rm $(docker ps -a -q)
docker rmi $(docker images -q)
-v ${PWD}:/app
-v <host_folder(Source Path)>:<container_folder(Mount Point)>
The host path must be an absolute path. On Windows it must start with the drive location (e.g. C:/Users/elvis/elvism/DataScience)
- Add :ro at the end to enable read-only access
docker run -t -v ${PWD}:/app python:3 bash
Use -v or --volume to mount a host directory inside a container
Use -v multiple times for several bind mounts
Alternative option:
--mount type=bind,source=${PWD},target=/app
It does the same thing as the -v option, but is more verbose.
display a list of running containers
docker ps
docker container ls
Use -a or --all to display all containers, including stopped and paused ones
-d or --detach
docker container attach <container>
docker attach <container>
Use the inspect command to get detailed metadata about the image. Use the history command to see how the image was built.
-e <key>=<value>
--env-file <file>
- read variables from a file on the Docker host
docker container exec <container> <command>
docker exec <container> <command>
docker exec -it <container> bash
Use -it for interactive mode
Use -w to change the working directory
Use -e <VAR>=<value> to pass environment variables
docker pull <image>
docker image pull <image>
Use :latest at the end to pull the latest version of the image
Use -a to pull all tags of the image
docker tag <old_image> <new_image>
docker rmi <old_image>
Note: docker rename only renames containers; an image is renamed by re-tagging it and removing the old name with docker rmi.
docker container logs <container_name>
docker logs <container>
docker logs -f <container>
- Print in real time; press CTRL+C to stop (it stops the command, not the container)
Use -t to print timestamps
Use --since to print logs since a specific time
docker image inspect <image>
docker inspect <image>
docker image history <image>
Components of a Docker Image:
Image Build Process:
FROM <base_image>
INSTRUCTION arguments \
    more arguments
INSTRUCTION arguments # comment
FROM <base_image>
COPY <from> <to>
RUN <command>
WORKDIR <path>
ENV <key> <value>
VOLUME <path>
EXPOSE <port>
CMD <command>
ENTRYPOINT <command>
docker build -t <image_name> .
The dot represents the current folder containing the Dockerfile.
This is useful when the base arguments need to be changed but we do not want to change them in the Dockerfile:
ARG base=python
ARG tag=3
FROM $base:$tag
With --build-arg tag=3.9 this resolves to python:3.9. Example:
docker build -t myimage:3.9 --build-arg tag=3.9 .
COPY <from> <to>
COPY <from> <from> <from> <destination>
To copy multiple files or folders the last argument must be a directory
COPY --chown=<user>:<group> <from> <to>
The user must exist in the image's /etc/passwd; the group must exist in the image's /etc/group.
Example:
COPY --chown=elvis:elvis /Users/elvis/elvism/DataScience/ /app/
COPY --chown=1000:55 src dst
RUN can be used to install packages and to execute commands inside the image during the build
RUN <command>
RUN <shell command line>
- Default Shell is /bin/sh
- No background Processes are allowed!
- Only non-interactive commands are allowed!
RUN examples:
RUN pip install -r requirements.txt
RUN pip install django==3.2.6
RUN conda install -y pandas
RUN python manage.py migrate
RUN useradd elvis && chown -R elvis:elvis /app > /tmp/logfile
VOLUME <path>
Declares a volume mount point at the given directory. The directory is created if it does not exist.
Why use VOLUME?
- To share data between containers
- Data persistence - they are independent of containers
- High Performance
List of Volume Mount Points
docker volume ls
Create a volume mount point:
docker volume create <name>
Remove a volume mount point:
docker volume rm <name>
docker login
docker tag <image> <registry>/<image>:<tag>
docker push <registry>/<image>:<tag>
The image name must include the Docker Hub account name or namespace.
Other docker registries can be used to push images to. You can use any registry that supports the Docker Registry API.
UPDATE table SET col1 = 0 WHERE col1 IS NULL;
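A runnable sketch of the same statement against an in-memory SQLite database; the table and column names are made up for the example:

```python
import sqlite3

# In-memory database to demonstrate UPDATE ... WHERE col IS NULL
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (name TEXT, col1 INTEGER)")
conn.executemany("INSERT INTO items VALUES (?, ?)",
                 [("a", 5), ("b", None), ("c", None)])

# Replace the null values with 0; rows that already have a value are untouched
conn.execute("UPDATE items SET col1 = 0 WHERE col1 IS NULL")

rows = conn.execute("SELECT name, col1 FROM items ORDER BY name").fetchall()
print(rows)  # [('a', 5), ('b', 0), ('c', 0)]
```

Note that `col1 = NULL` would not match anything in the WHERE clause; SQL requires `IS NULL` for null comparisons.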