Closed sudhirkshirsagar closed 3 years ago
Note that the smtc_db_init image derives from the smtc_common image.
The image dependency does not resolve script dependencies. Within the same image, the scripts still need to reference the correct directories for the Python imports to work. How does one modify the provisioning after deployment, once the images are built?
What script dependency is missing?
BTW, provision.py is just an example of initial provisioning. In a real product, a field provisioning tool must be written to dynamically insert provisioning information into the DB. That field tool is out of scope for the sample; provision.py is there only to write something to the DB so that the sample can function.
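To make that concrete, a field tool of that kind could reuse the same DBIngest helper that provision.py uses. The snippet below is only a hypothetical sketch: the endpoint, coordinates, and record values are illustrative, and only the DBIngest calls mirror what provision.py itself does.

    # Hypothetical field-provisioning sketch: push one extra camera record into the
    # "provisions" index so that discovery can later match and activate it.
    from db_ingest import DBIngest            # shared helper from the common scripts

    dbhost = "http://db:9200"                 # DB endpoint (illustrative)
    office = [45.539626, -122.929569]         # office lat/lon, same format as the OFFICE env var

    new_sensor = {
        "office": {"lat": office[0], "lon": office[1]},
        "ip": "192.168.1.71/32",              # CIDR form, as provision.py stores it
        "ip_text": "192.168.1.71/32",         # duplicate field for terms aggregations
    }

    dbp = DBIngest(index="provisions", office=office, host=dbhost)
    dbp.ingest_bulk([new_sensor], refresh="wait_for")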
from db_ingest import DBIngest is one example of the dependencies; that script is in common. Although full reprovisioning may be out of scope, the sample appears to have a very specific one-shot architecture. Also, the sensors from sensor-info.json don't seem to get provisioned, and it is not clear how to get them provisioned during the initial provisioning.
Yes, db_init imports DBIngest from db_ingest, which is in smtc_common. This is why the db_init image uses smtc_common as the base image. The final built image, smtc_db_init, is self-sufficient with all of its dependencies. Other images that need DBIngest either use smtc_common as the base image or copy the *.py scripts over. What was the issue exactly?
As for provisioning, the camera info from sensor-info.json is written to the provisions index. Then at runtime, when a new sensor is discovered, its IP address or URL is compared against the provisions index to find a match, and the discovered sensor is written to the sensors index. See sensor/discovery for details.
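Conceptually, the match at discovery time amounts to something like the following simplified, illustrative sketch of the flow just described (not the actual sensor/discovery code):

    # Illustrative sketch of matching a discovered camera against the provisions index.
    import ipaddress

    def match_provision(discovered_ip, provisioned_records):
        """Return the provisioned record whose CIDR block covers the discovered IP, if any."""
        for record in provisioned_records:            # records written by provision.py
            if "ip" not in record:
                continue
            if ipaddress.ip_address(discovered_ip) in ipaddress.ip_network(record["ip"], strict=False):
                return record                         # provisioned camera found
        return None                                   # unknown camera, not activated

    # A matched camera would then be ingested into the "sensors" index, e.g. via
    # DBIngest(index="sensors", office=office, host=dbhost).ingest_bulk([record]).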
Python scripts can't reference scripts in other directories; they need an exact reference to those directories. Copying the required scripts is an option, but not the best one. If following the tutorial steps does not work on a single Xeon server where all the requirements have been installed, then I suspect there are dependencies that are not being addressed in the published version, or perhaps a single-server install has never been tested in spite of what the tutorial says. The master branch builds but the v21.3 build fails, which is another data point that the architecture is not fully tested for a single-server docker swarm install.
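For anyone hitting the same import errors outside the containers: when running a script straight from a source checkout rather than inside the built image, the shared helpers have to be put on the module search path explicitly. A minimal sketch, assuming db_ingest.py lives in a directory such as common/ (the exact layout may differ, and SMTC_COMMON_DIR is a made-up knob for this example):

    import os, sys

    # Point this at the directory that actually contains db_ingest.py (layout assumption).
    COMMON_DIR = os.environ.get("SMTC_COMMON_DIR", "/path/to/Smart-City-Sample/common")
    sys.path.insert(0, COMMON_DIR)

    from db_ingest import DBIngest   # resolves once the shared directory is on sys.path

Inside the built images none of this is needed, since all the scripts end up copied into /home together.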
I don't fully understand what you are saying. All Python scripts are under /home, and there is no issue with execution. A single-node docker swarm is tested. The v21.3 branch does not build simply because the required OpenVisualCloud software stacks have not yet been released and uploaded to Docker Hub.
What image are you referring to when you run provision.py in a built image? Obviously the testing did not follow the tutorial method for a single-node/server swarm on Ubuntu 18.04: everything builds and all services run, but you get errors in the services. A branch should be able to be tested and run; otherwise there should be comments in the README (maybe there are). Rebuilding the analytics did not change anything.
Let's use the smtc_db_init image as an example. You can run the image to take a look:
docker run --rm -it smtc_db_init ls -l /home
You can see that all the python scripts and their dependencies are under /home.
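As a quick sanity check, one could also try the import itself inside the image (assuming the image's working directory is /home and a python3 interpreter is on the PATH, which the built image appears to provide):

    docker run --rm -it smtc_db_init python3 -c "from db_ingest import DBIngest; print('import OK')"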
The smtc_db_init image does show the scripts under /home, but clearly the sensors from the JSON are not being provisioned. The environment also shows lat/long for the OFFICE variable, as shown below, and the provisioning script has no comments, so it is impossible to follow. It also uses variables like office1.

bash-4.2$ printenv
HOSTNAME=498cc4b2b5c4
TERM=xterm
REPLICAS=0,0
NO_PROXY=
PYTHONIOENCODING=UTF-8
DBHOST=http://db:9200
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
OFFICE=45.539626,-122.929569
GWHOST=http://traffic_office1_gateway:8080
PWD=/home
SHLVL=1
HOME=/home
SCENARIO=traffic
no_proxy=
DBCHOST=http://cloud_gateway:8080/cloud/api/db
_=/usr/bin/printenv
OLDPWD=/
bash-4.2$
dbhost=env["DBHOST"]
office=list(map(float,env["OFFICE"].split(",")))
gwhost=env.get("GWHOST",None)
scenario=env["SCENARIO"]

def Provision(officestr):
    print("Provisioning...", flush=True)

    # populate db with simulated offices and provisionings
    with open("/run/secrets/sensor-info.json",encoding='utf-8') as fd:
        data=json.load(fd)

    dbp=DBIngest(index="provisions", office=office, host=dbhost)
    for office1 in data:
        if scenario != office1["scenario"]: continue

        location1=office1["location"]
        if location1["lat"]!=office[0] or location1["lon"]!=office[1]:
            continue

        sensors=office1.pop("sensors")
        for s in sensors:
            s["office"]=location1
            if "ip" in s:  # convert IP to CIDR
                if s["ip"].find("/")<0:
                    s["ip"]=s["ip"]+"/32"
                s["ip_text"]=s["ip"]  # dup for terms aggs

        dbp.ingest_bulk(sensors, refresh="wait_for")
        office1.pop("scenario")
Please check deployment/docker-swarm/secret.m4 and deployment/docker-swarm/analytics.m4. The sensor-info.json is mounted into the container via docker secrets.
Here are the files (secret.m4 first, then analytics.m4). These are all created by the standard build as defined in the tutorial.
secrets:
    self_key:
        file: ../certificate/self.key
    self_crt:
        file: ../certificate/self.crt
    sensor_info:
        file: ../../maintenance/db-init/sensor-info.json
include(platform.m4)

ifelse(defn(`SCENARIO_NAME'),`traffic',`
defn(`OFFICE_NAME')_analytics_traffic:
    image: PLATFORM_IMAGE(defn(`REGISTRY_PREFIX')`smtc_analytics_object'defn(`PLATFORM_SUFFIX')`_'defn(`FRAMEWORK'):latest)
    volumes:
        PLATFORM_ENV(OFFICE): "defn(`OFFICE_LOCATION')"
        PLATFORM_ENV(DBHOST): "http://ifelse(defn(`NOFFICES'),1,db,defn(`OFFICE_NAME')_db):9200"
        PLATFORM_ENV(MQTTHOST): "defn(`OFFICE_NAME')_mqtt"
        PLATFORM_ENV(MQTT_TOPIC): "ifelse(defn(`OT_TYPE'),`false',analytics,relayanalytics)"
        PLATFORM_ENV(EVERY_NTH_FRAME): 6
        PLATFORM_ENV(SCENARIO): "defn(`SCENARIO_NAME')"
        PLATFORM_ENV(STHOST): "http://defn(`OFFICE_NAME')_storage:8080/api/upload"
        PLATFORM_ENV(PIPELINE_VERSION): 2
        PLATFORM_ENV(NETWORK_PREFERENCE): "{\"defn(`PLATFORM_DEVICE')\":\"defn(`NETWORK_PREFERENCE')\"}"
        PLATFORM_ENV(NO_PROXY): ""
        PLATFORM_ENV(no_proxy): ""
        PLATFORM_ENV_EXTRA()dnl
    networks:

ifelse(defn(`SCENARIO_NAME'),`stadium',`
defn(`OFFICE_NAME')_analytics_entrance:
    image: PLATFORM_IMAGE(defn(`REGISTRY_PREFIX')`smtc_analytics_entrance'defn(`PLATFORM_SUFFIX')`_'defn(`FRAMEWORK'):latest)
    volumes:
        PLATFORM_ENV(OFFICE): "defn(`OFFICE_LOCATION')"
        PLATFORM_ENV(DBHOST): "http://ifelse(defn(`NOFFICES'),1,db,defn(`OFFICE_NAME')_db):9200"
        PLATFORM_ENV(MQTTHOST): "defn(`OFFICE_NAME')_mqtt"
        PLATFORM_ENV(MQTT_TOPIC): "ifelse(defn(`OT_TYPE'),`false',analytics,relayanalytics)"
        PLATFORM_ENV(EVERY_NTH_FRAME): 6
        PLATFORM_ENV(SCENARIO): "defn(`SCENARIO_NAME')"
        PLATFORM_ENV(STHOST): "http://defn(`OFFICE_NAME')_storage:8080/api/upload"
        PLATFORM_ENV(NETWORK_PREFERENCE): "{\"defn(`PLATFORM_DEVICE')\":\"defn(`NETWORK_PREFERENCE')\"}"
        PLATFORM_ENV(NO_PROXY): ""
        PLATFORM_ENV(no_proxy): ""
        PLATFORM_ENV_EXTRA()dnl
    networks:
        - appnet
    deploy:
        replicas: defn(`NANALYTICS3')
        placement:
            constraints:

defn(`OFFICE_NAME')_analytics_crowd:
    image: PLATFORM_IMAGE(defn(`REGISTRY_PREFIX')`smtc_analytics_crowd_'defn(`PLATFORM_SUFFIX')`'defn(`FRAMEWORK'):latest)
    volumes:
        PLATFORM_ENV(OFFICE): "defn(`OFFICE_LOCATION')"
        PLATFORM_ENV(DBHOST): "http://ifelse(defn(`NOFFICES'),1,db,defn(`OFFICE_NAME')_db):9200"
        PLATFORM_ENV(MQTTHOST): "defn(`OFFICE_NAME')_mqtt"
        PLATFORM_ENV(MQTT_TOPIC): "analytics"
        PLATFORM_ENV(EVERY_NTH_FRAME): "6"
        PLATFORM_ENV(SCENARIO): "defn(`SCENARIO_NAME')"
        PLATFORM_ENV(STHOST): "http://defn(`OFFICE_NAME')_storage:8080/api/upload"
        PLATFORM_ENV(NETWORK_PREFERENCE): "{\"defn(`PLATFORM_DEVICE')\":\"defn(`NETWORK_PREFERENCE')\"}"
        PLATFORM_ENV(NO_PROXY): ""
        PLATFORM_ENV(no_proxy): ""
        PLATFORM_ENV_EXTRA()dnl
    networks:
        - appnet
    deploy:
        replicas: defn(`NANALYTICS2')
        placement:
            constraints:

defn(`OFFICE_NAME')_analytics_svcq:
    image: PLATFORM_IMAGE(defn(`REGISTRY_PREFIX')`smtc_analytics_object_'defn(`PLATFORM_SUFFIX')`'defn(`FRAMEWORK'):latest)
    volumes:
        PLATFORM_ENV(OFFICE): "defn(`OFFICE_LOCATION')"
        PLATFORM_ENV(DBHOST): "http://ifelse(defn(`NOFFICES'),1,db,defn(`OFFICE_NAME')_db):9200"
        PLATFORM_ENV(MQTTHOST): "defn(`OFFICE_NAME')_mqtt"
        PLATFORM_ENV(MQTT_TOPIC): "ifelse(defn(`OT_TYPE'),`false',analytics,relayanalytics)"
        PLATFORM_ENV(EVERY_NTH_FRAME): 6
        PLATFORM_ENV(SCENARIO): "defn(`SCENARIO_NAME')"
        PLATFORM_ENV(STHOST): "http://defn(`OFFICE_NAME')_storage:8080/api/upload"
        PLATFORM_ENV(PIPELINE_VERSION): 2
        PLATFORM_ENV(NETWORK_PREFERENCE): "{\"defn(`PLATFORM_DEVICE')\":\"defn(`NETWORK_PREFERENCE')\"}"
        PLATFORM_ENV(NO_PROXY): ""
        PLATFORM_ENV(no_proxy): ""
        PLATFORM_ENV_EXTRA()dnl
    networks:
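For what it's worth, the service that runs provision.py has to list the sensor_info secret from secret.m4 under its own secrets: key for the file to show up at /run/secrets/sensor-info.json inside the container. A hedged sketch in plain compose syntax (the service name and image tag are illustrative, not a quote from the repo's m4):

    db_init:
      image: smtc_db_init:latest
      secrets:
        - source: sensor_info          # the secret defined in secret.m4 above
          target: sensor-info.json     # mounted as /run/secrets/sensor-info.json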
It is not possible to run provision.py from its own directory because it references .py files in other directories.