ryanwong215 / quantitative


Profile not set #2

Open ryanwong215 opened 5 days ago

ryanwong215 commented 5 days ago

PowerShell Script to Modify the Profile to Prepend User PATH

```powershell
param (
    [Parameter(Mandatory = $true)]
    [string]$PrependPath
)

# Step 1: Validate the directory
if (!(Test-Path -Path $PrependPath)) {
    Write-Error "Error: The provided directory '$PrependPath' does not exist."
    exit 1
}

# Step 2: Get the PowerShell profile path
$profilePath = $PROFILE

# Step 3: Ensure the profile file exists
if (!(Test-Path -Path $profilePath)) {
    Write-Output "Profile file does not exist. Creating it at: $profilePath"
    New-Item -ItemType File -Path $profilePath -Force
}

# Step 4: Update the profile to prepend the path to the user's PATH variable
$profileContent = @"
# Prepend a custom path to the PATH environment variable
`$userPath = [System.Environment]::GetEnvironmentVariable("PATH", [System.EnvironmentVariableTarget]::User)
if (-not `$userPath.Contains("$PrependPath")) {
    `$env:PATH = "$PrependPath;`$env:PATH"
}
"@

Write-Output "Adding the path modification logic to the PowerShell profile..."
Add-Content -Path $profilePath -Value $profileContent -Force

# Step 5: Reload the profile for the current session
Write-Output "Reloading the PowerShell profile for this session..."
. $PROFILE

# Step 6: Confirm the PATH is updated
Write-Output "Updated PATH:"
$env:PATH
```

ryanwong215 commented 4 days ago

```powershell
# Define paths
$JDKPath = "C:\Program Files\Java\jdk-17"                   # Update with your JDK path
$PyCharmPath = "C:\Program Files\JetBrains\PyCharm 2023.2"  # Update with your PyCharm path
$VMOptionsFile = "$PyCharmPath\bin\pycharm64.vmoptions"

# Check if JDK path exists
if (-Not (Test-Path $JDKPath)) {
    Write-Host "JDK path does not exist: $JDKPath" -ForegroundColor Red
    exit 1
}

# Check if PyCharm path exists
if (-Not (Test-Path $PyCharmPath)) {
    Write-Host "PyCharm path does not exist: $PyCharmPath" -ForegroundColor Red
    exit 1
}

# Set JAVA_HOME as a machine-level environment variable (requires an elevated prompt)
[System.Environment]::SetEnvironmentVariable("JAVA_HOME", $JDKPath, [System.EnvironmentVariableTarget]::Machine)
Write-Host "Set JAVA_HOME to: $JDKPath"

# Back up the existing VM options file if not already backed up
$BackupFile = "$VMOptionsFile.bak"
if ((Test-Path $VMOptionsFile) -and -Not (Test-Path $BackupFile)) {
    Copy-Item $VMOptionsFile $BackupFile
    Write-Host "Backed up VM options file to: $BackupFile"
}

# Update the VM options file
@"
-Djava.home=$JDKPath
-Xmx2g
"@ | Set-Content -Path $VMOptionsFile -Encoding UTF8

Write-Host "Updated PyCharm VM options at: $VMOptionsFile"

# Verify the configuration
Write-Host "JAVA_HOME is now: $([System.Environment]::GetEnvironmentVariable('JAVA_HOME', [System.EnvironmentVariableTarget]::Machine))"
Write-Host "Setup completed. Please restart PyCharm to apply the changes."
```

ryanwong215 commented 3 days ago

```powershell
# Define paths
$PycharmPath = "C:\Path\To\PyCharm\bin\pycharm64.exe"   # Update with PyCharm executable path
$JavaPath = "C:\Path\To\Java"                           # Update with JAVA_HOME path
$PythonPath = "C:\Path\To\Python"                       # Update with Python installation path
$DownloadDir = "$env:USERPROFILE\Downloads\setup_pycharm_spark"   # Directory for downloads
$SparkURL = "https://downloads.apache.org/spark/spark-3.3.0/spark-3.3.0-bin-hadoop3.tgz"
$HadoopWinutilsURL = "https://github.com/steveloughran/winutils/raw/master/hadoop-2.6.0/bin/winutils.exe"
$ShortcutPath = "$env:USERPROFILE\Desktop\PyCharm_Spark.lnk"

# Ensure download directory exists
if (-Not (Test-Path $DownloadDir)) {
    New-Item -ItemType Directory -Path $DownloadDir
}

# Download and extract Spark
$SparkTGZ = "$DownloadDir\spark-3.3.0-bin-hadoop3.tgz"
$SparkExtractDir = "$DownloadDir\spark-3.3.0-bin-hadoop3"
$SPARK_HOME = "$env:USERPROFILE\spark"

if (-Not (Test-Path $SPARK_HOME)) {
    Write-Host "Downloading Spark..."
    Invoke-WebRequest -Uri $SparkURL -OutFile $SparkTGZ

    Write-Host "Extracting Spark..."
    tar -xf $SparkTGZ -C $DownloadDir
    Move-Item -Path $SparkExtractDir -Destination $SPARK_HOME
    Write-Host "Spark set up at: $SPARK_HOME"
} else {
    Write-Host "Spark already set up at: $SPARK_HOME"
}

# Download Hadoop winutils
$HadoopBinDir = "$env:USERPROFILE\hadoop\bin"
$HadoopWinutils = "$HadoopBinDir\winutils.exe"

if (-Not (Test-Path $HadoopWinutils)) {
    Write-Host "Downloading Hadoop winutils..."
    New-Item -ItemType Directory -Path $HadoopBinDir -Force
    Invoke-WebRequest -Uri $HadoopWinutilsURL -OutFile $HadoopWinutils
    Write-Host "Hadoop winutils set up at: $HadoopBinDir"
} else {
    Write-Host "Hadoop winutils already set up at: $HadoopBinDir"
}

# Create the batch file that sets the environment and launches PyCharm
$BatchFilePath = "$DownloadDir\launch_pycharm.bat"
Set-Content -Path $BatchFilePath -Value @"
@echo off
:: Set environment variables
set JAVA_HOME=$JavaPath
set PATH=%JAVA_HOME%\bin;%PATH%

set PYTHON_HOME=$PythonPath
set PATH=%PYTHON_HOME%\Scripts;%PYTHON_HOME%;%PATH%

set SPARK_HOME=$SPARK_HOME
set PATH=%SPARK_HOME%\bin;%PATH%

set HADOOP_HOME=$env:USERPROFILE\hadoop
set PATH=%HADOOP_HOME%\bin;%PATH%

:: Launch PyCharm
start "" "$PycharmPath"
"@
Write-Host "Batch file created at: $BatchFilePath"

# Create the VBS file that runs the batch file without a console window
$VbsFilePath = "$DownloadDir\launch_pycharm.vbs"
Set-Content -Path $VbsFilePath -Value @"
Set WshShell = CreateObject("WScript.Shell")
WshShell.Run "cmd.exe /c $BatchFilePath", 0
"@
Write-Host "VBS file created at: $VbsFilePath"

# Create the Desktop shortcut
$WshShell = New-Object -ComObject WScript.Shell
$Shortcut = $WshShell.CreateShortcut($ShortcutPath)
$Shortcut.TargetPath = $VbsFilePath
$Shortcut.WorkingDirectory = (Get-Item $BatchFilePath).DirectoryName
$Shortcut.IconLocation = $PycharmPath
$Shortcut.Save()
Write-Host "Shortcut created on Desktop with PyCharm icon."

Write-Host "Setup completed. Use the shortcut on your Desktop to launch PyCharm with Spark and Hadoop configured."
```

ryanwong215 commented 2 days ago

Charts attached: "Focused Usage Trends of Bare Metal, VMs, and Containers (2000s-2025)", "Estimated Number of Computers Over Time (1960s-2020s)", and "Estimated Number of Applications/Services Over Time (1960s-2020s)".

ryanwong215 commented 2 days ago

Chart attached: "Usage Trends of Bare Metal, VMs, and Containers (2000-2024)".

ryanwong215 commented 2 days ago
1. Virtualization Adoption:
   • A 2020 report by Spiceworks indicated that application virtualization was expected to grow from 39% to 56% by 2021. (Spiceworks)
2. Container Adoption:
   • A 2021 survey by Statista reported that 96% of organizations were using container technology in development and test environments. (Statista)
   • The Cloud Native Computing Foundation's 2021 survey highlighted a 37% year-over-year increase in Kubernetes adoption. (CNCF)
3. Market Trends:
   • The global virtual machine market was valued at approximately USD 9.79 billion in 2023, with projections to reach USD 61.61 billion by 2036, indicating sustained growth in VM adoption. (Research Nester)
   • The container technology market has seen significant growth due to benefits like simplified deployment and scalability. (Statista)

These sources provide insights into the adoption trends of VMs and containers over the past two decades.

ryanwong215 commented 2 days ago

ELK (Elasticsearch, Logstash, Kibana) and Grafana/Prometheus are both powerful tools for observability, but they serve different purposes and excel in different areas. Here’s a breakdown:

ELK (Elasticsearch, Logstash, Kibana):

Primary Use: Centralized logging, search, and log analysis.

1.  Elasticsearch:
•   Stores and indexes data, primarily logs and structured data.
•   Provides fast search and filtering capabilities.
2.  Logstash:
•   Ingests, parses, and transforms logs or other data sources.
•   Allows complex pipeline configurations for data enrichment.
3.  Kibana:
•   A visualization tool for Elasticsearch data.
•   Enables building dashboards for log analysis, system metrics, and more.

Key Strengths:

•   Centralized logging: Aggregates logs from multiple systems for easy access.
•   Advanced log analysis: Search through logs with powerful queries.
•   Visualization: Create dashboards and perform root-cause analysis using log data.
•   Scalability: Handles large-scale log ingestion and analysis.

Common Use Cases:

•   Debugging application errors by analyzing logs.
•   Monitoring logs for specific events or patterns (e.g., security incidents).
•   Business intelligence from log data (e.g., user activity trends).
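
To make the Elasticsearch piece concrete, here is a minimal sketch of a full-text log search against its `_search` API from PowerShell. The host, port, and the `app-logs` index name are assumptions; adjust them for your cluster.

```powershell
# Search an assumed "app-logs" index for recent ERROR-like entries via the Elasticsearch _search API.
$body = @{
    query = @{
        bool = @{
            must   = @(@{ match = @{ message = "error" } })
            filter = @(@{ range = @{ "@timestamp" = @{ gte = "now-15m" } } })
        }
    }
    size = 20
} | ConvertTo-Json -Depth 10

$response = Invoke-RestMethod -Method Post -Uri "http://localhost:9200/app-logs/_search" `
    -ContentType "application/json" -Body $body

# Print the matching log messages.
$response.hits.hits | ForEach-Object { $_._source.message }
```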

Grafana / Prometheus:

Primary Use: Monitoring and alerting based on time-series metrics.

1.  Prometheus:
•   A time-series database optimized for metrics collection and query.
•   Pulls metrics data from configured endpoints via HTTP (scraping).
•   Includes built-in alerting with Alertmanager.
2.  Grafana:
•   A visualization and dashboarding tool.
•   Can integrate with various data sources, including Prometheus, Elasticsearch, MySQL, etc.
•   Provides customizable dashboards for real-time monitoring.

Key Strengths:

•   Metrics collection: Handles high-frequency metrics (e.g., CPU usage, memory consumption).
•   Real-time monitoring: Visualize current system state.
•   Alerts: Trigger alerts based on metric thresholds or anomalies.
•   Multi-source integration: Combine metrics from multiple systems into a single view.

Common Use Cases:

•   Infrastructure monitoring (CPU, memory, disk, network).
•   Application performance monitoring (APM) using custom metrics.
•   Real-time system health dashboards.
•   Alerting for infrastructure and application anomalies.
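
As an illustration of the pull/metrics model, here is a minimal sketch of an instant query against Prometheus's HTTP API from PowerShell. The server URL is an assumption; `up` is the built-in metric Prometheus records for each scrape target's health.

```powershell
# Ask an assumed local Prometheus server which scrape targets are currently up.
$result = Invoke-RestMethod -Uri "http://localhost:9090/api/v1/query?query=up"

# Each result pairs a target's labels with its latest sample value (1 = up, 0 = down).
$result.data.result | ForEach-Object {
    "{0} ({1}) -> {2}" -f $_.metric.instance, $_.metric.job, $_.value[1]
}
```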

Key Differences:

| Feature | ELK | Grafana / Prometheus |
| --- | --- | --- |
| Focus | Logs and unstructured data | Metrics and time-series data |
| Data Type | Text-heavy (logs, events) | Numeric (metrics) |
| Ingestion | Push-based (Logstash, Beats) | Pull-based (Prometheus scrapes targets) |
| Storage | Elasticsearch (log storage) | Prometheus (time-series DB) |
| Visualization | Kibana (log-oriented dashboards) | Grafana (real-time metrics dashboards) |
| Alerting | Limited (via Watcher, custom plugins) | Strong (via Prometheus Alertmanager) |
| Complexity | Higher (ELK setup and scaling) | Lower (Grafana + Prometheus is simpler) |

When to Use ELK vs Grafana/Prometheus:

•   Use ELK when:
•   Your primary need is centralized logging.
•   You want to perform detailed log analysis or debugging.
•   You need full-text search capabilities across large volumes of logs.
•   Use Grafana/Prometheus when:
•   Your primary need is real-time monitoring.
•   You are focusing on time-series metrics (e.g., CPU, memory, API response time).
•   You need robust alerting and trend visualization.

Combining ELK and Grafana/Prometheus:

Many organizations use both tools together:

•   Use ELK for log aggregation and debugging.
•   Use Grafana/Prometheus for monitoring system health and alerting.
•   Grafana can integrate with Elasticsearch, allowing you to visualize logs alongside metrics.

This combined approach gives full-stack observability: metrics for performance monitoring, logs for root-cause analysis, and unified dashboards for operational insights.
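
For example, wiring Elasticsearch into Grafana can be scripted through Grafana's data source API. The sketch below assumes a local Grafana instance, an API token, and an `app-logs` index, so treat the URLs, token, and field names as placeholders rather than a definitive setup.

```powershell
# Register an Elasticsearch data source in Grafana via its HTTP API (token, URLs, and index are placeholders).
$headers = @{ Authorization = "Bearer YOUR_GRAFANA_API_TOKEN" }
$datasource = @{
    name     = "app-logs-es"
    type     = "elasticsearch"
    access   = "proxy"
    url      = "http://localhost:9200"
    database = "app-logs"
    jsonData = @{ timeField = "@timestamp" }
} | ConvertTo-Json -Depth 5

Invoke-RestMethod -Method Post -Uri "http://localhost:3000/api/datasources" `
    -Headers $headers -ContentType "application/json" -Body $datasource
```

Once the data source is registered, log panels can sit next to Prometheus metric panels on the same Grafana dashboard.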