Post on X: 3.5k impressions.
Factors: the picture, the phrase "mindful developer community", and the 7 AM UTC posting time.
![image](https://github.com/aleeusgr/nix-things/assets/36756030/6a49106f-cbf4-49e6-83b0-a8f817adb429)
![image](https://github.com/aleeusgr/nix-things/assets/36756030/bc2f60e2-43cd-4269-9cc2-7f5b7204192a)
![image](https://github.com/aleeusgr/nix-things/assets/36756030/4dc68796-7874-4d3e-b50b-97a025fd989c)
![image](https://github.com/aleeusgr/nix-things/assets/36756030/8083e48f-5003-4ad7-8993-4574c53257ad)
A mindful developer community is a group of software developers and technologists who prioritize mental well-being, emotional intelligence, and sustainable practices in their work environment and professional interactions. This community values the importance of self-awareness, empathy, collaboration, and continuous learning to create an inclusive, supportive atmosphere that fosters personal growth and innovation.
```diff
index e98cf1a..4aa24b1 100644
--- a/flake.nix
+++ b/flake.nix
@@ -3,11 +3,17 @@
   inputs = {
     nixpkgs.url = "github:nixos/nixpkgs/nixos-unstable";
-    home-manager.url = "github:nix-community/home-manager";
-    home-manager.inputs.nixpkgs.follows = "nixpkgs";
+    home-manager = {
+      url = "github:nix-community/home-manager";
+      inputs.nixpkgs.follows = "nixpkgs";
+    };
+    codeium = {
+      url = "github:jcdickinson/codeium.nvim";
+      inputs.nixpkgs.follows = "nixpkgs";
+    };
   };
-  outputs = inputs@{ nixpkgs, home-manager, ... }: {
+  outputs = inputs@{ nixpkgs, home-manager, codeium, ... }: {
     nixosConfigurations = {
       vivo = nixpkgs.lib.nixosSystem {
         system = "x86_64-linux";
--- a/home.nix
+++ b/home.nix
@@ -40,6 +40,16 @@ let
     #ihaskell-blaze
     #ghcup
   ];
+  nvimWithCodeium = neovim.override {
+    configure = {
+      customRC = ''
+        lua require("codeium").setup()
+      '';
+      packages.myPlugins = {
+        start = [vimPlugins.codeium-nvim vimPlugins.plenary-nvim vimPlugins.nvim-cmp];
+      };
+    };
+  };
 in
 {
   dconf = {
@@ -72,6 +82,7 @@ in
   # Neovim is starting to get huge; should move to a separate file.
   neovim = {
     enable = true;
+    package = nvimWithCodeium;
```
- Add llm-ls: https://github.com/roman3pm/manix/blob/9cdee1f93ab2de92bf557945400c5546ac082e06/profiles/home.nix#L53
- Add llm-nvim: https://github.com/roman3pm/manix/blob/9cdee1f93ab2de92bf557945400c5546ac082e06/profiles/neovim/default.nix#L54-L60
- Add lua config: https://github.com/roman3pm/manix/blob/9cdee1f93ab2de92bf557945400c5546ac082e06/profiles/neovim/lsp.lua#L196-L205
askai.sh improvements:
- Add an option to read an arbitrary text file or input.
- Make it accessible globally via home-manager.
- Try Continue + VSCode. Setup guide: https://docs.continue.dev/walkthroughs/set-up-codestral?ref=blog.continue.dev
- Running Continue without Internet: https://docs.continue.dev/walkthroughs/running-continue-without-internet
- VS Code plugin: https://marketplace.visualstudio.com/items?itemName=Continue.continue
- Codestral keys: https://console.mistral.ai/codestralcontinune
- Suggested local models for Chat, Autocomplete, and Embeddings: https://docs.continue.dev/setup/select-model
**Description:** As a developer on NixOS, I'd like to integrate an AI coding assistant with Neovim to enhance my coding experience. This assistant should provide real-time suggestions, code completion, and debugging assistance to help me write more efficient and accurate code.

**Current Behavior:** Currently, I'm using Neovim as my editor on NixOS, but I'm missing the benefits of an AI-powered coding assistant. I've tried installing various plugins and extensions, but none of them provide the level of AI-driven assistance I need.

**Desired Behavior:** I'd like to see an AI coding assistant integrated with Neovim on NixOS that provides the following features:

1. **Real-time suggestions:** As I type, the assistant should suggest relevant code snippets, functions, and variables to help me complete my code.
2. **Code completion:** The assistant should auto-complete code snippets, functions, and variables as I type, reducing the need for manual typing.
3. **Debugging assistance:** The assistant should provide real-time debugging information, such as variable values, function calls, and error messages, to help me identify and fix issues.
4. **Code analysis:** The assistant should analyze my code and provide suggestions for improvement, such as code refactoring, optimization, and best practices.

**Technical Requirements:** To implement this feature, I'd like to see the following technical requirements met:

1. **Integration with Neovim:** The AI coding assistant should seamlessly integrate with Neovim, allowing me to access its features and functionality.
2. **NixOS compatibility:** The solution should be compatible with NixOS and its package manager, Nix.
3. **Language support:** The assistant should support multiple programming languages, including Python, Java, and C++.
4. **Customizability:** The assistant should allow for customization of its behavior, such as adjusting the level of suggestions, code completion, and debugging assistance.

**Acceptance Criteria:** To confirm that the AI coding assistant is working correctly, the following acceptance criteria should be met:

1. The assistant provides real-time suggestions and code completion as I type.
2. The assistant provides accurate debugging information and code analysis.
3. The assistant is customizable and integrates seamlessly with Neovim.
4. The solution is compatible with NixOS and its package manager, Nix.

**Estimated Time:** 2-3 weeks
obsidian-ollama-chat is a plugin that displays the prompt input in Obsidian: press Ctrl+P in Obsidian and find "Ollama Chat".

https://github.com/brumik/ollama-obsidian-indexer is the backend for the Obsidian plugin, written in Python (Flask); it runs as a binary built by Nix. The plugin talks to the indexer on localhost:5000. The indexer is a wrapper around ollama.

Fix by reading the indexer README: there is a `.env` file that needs to be in the project root; a quick check of it is sketched below.
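To confirm the `.env` is actually picked up, a minimal sketch (it assumes the indexer reads its settings with python-dotenv, which the README's `.env` requirement suggests but does not guarantee):

```python
# check_env.py - hypothetical helper; run from the indexer's project root.
import os

from dotenv import load_dotenv  # assumption: python-dotenv is how the .env is loaded

load_dotenv()  # reads .env from the current working directory
print("NOTES_BASE_PATH =", os.getenv("NOTES_BASE_PATH"))  # None means the .env was not found
```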
File "/nix/store/3xn84lvjwcji160gq0vbi4c7g7z62j5g-python3.11-httpx-0.27.0/lib/python3.11/site-packages/httpx/_transports/default.py", line 232, in handle_request
with map_httpcore_exceptions():
File "/nix/store/k3701zl6gmx3la7y4dnflcvf3xfy88kh-python3-3.11.9/lib/python3.11/contextlib.py", line 158, in __exit__
self.gen.throw(typ, value, traceback)
File "/nix/store/3xn84lvjwcji160gq0vbi4c7g7z62j5g-python3.11-httpx-0.27.0/lib/python3.11/site-packages/httpx/_transports/default.py", line 86, in map_httpcore_exceptions
raise mapped_exc(message) from exc
httpx.ReadTimeout: timed out
Most likely because the hardware is weak: local inference takes longer than the HTTP client's read timeout.
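One mitigation to experiment with (a sketch, not something the indexer exposes today: `request_timeout` is a parameter of llama-index's `Ollama` wrapper):

```python
from llama_index.llms.ollama import Ollama

# Give slow local inference more time before httpx raises ReadTimeout.
llm = Ollama(
    base_url="http://localhost:11434",
    model="mistral",
    request_timeout=300.0,  # seconds; the default is far lower
)
```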
```shell
File "/nix/store/lyblfjr3pdvy15nmb7y8aw035ysgi0ng-python3.11-ollama-obsidian-indexer-0.1.0/lib/python3.11/site-packages/ollama_obsidian_indexer/llm.py", line 104, in index_dir
if any(fname.endswith(".md") for fname in os.listdir(dir_path)):
^^^^^^^^^^^^^^^^^^^^^^
FileNotFoundError: [Errno 2] No such file or directory: ''
```
The package is built using `nix build` from this flake:
```nix
{
  description = "Application packaged using poetry2nix";

  inputs = {
    flake-utils.url = "github:numtide/flake-utils";
    nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable-small";
    poetry2nix = {
      url = "github:nix-community/poetry2nix";
      inputs.nixpkgs.follows = "nixpkgs";
    };
  };

  outputs = { self, nixpkgs, flake-utils, poetry2nix }:
    flake-utils.lib.eachDefaultSystem (system:
      let
        # see https://github.com/nix-community/poetry2nix/tree/master#api for more functions and examples.
        pkgs = nixpkgs.legacyPackages.${system};
        inherit (poetry2nix.lib.mkPoetry2Nix { inherit pkgs; }) mkPoetryApplication;
      in
      {
        packages = {
          myapp = mkPoetryApplication { projectDir = self; preferWheels = true; };
          default = self.packages.${system}.myapp;
        };

        # Shell for app dependencies.
        #
        #     nix develop
        #
        # Use this shell for developing your app.
        devShells.default = pkgs.mkShell {
          inputsFrom = [ self.packages.${system}.myapp ];
          packages = [ pkgs.poetry ];
        };

        # Shell for poetry.
        #
        #     nix develop .#poetry
        #
        # Use this shell for changes to pyproject.toml and poetry.lock.
        devShells.poetry = pkgs.mkShell {
          packages = [ pkgs.poetry ];
        };
      });
}
```
pyproject.toml:
```toml
[tool.poetry]
name = "ollama_obsidian_indexer"
version = "0.1.0"
description = "Summarizes a youtube video from a link using transcript and your local ollama"
authors = ["Levente Berky"]  # email truncated in the issue
```
llm.py, the module named in the tracebacks above (imports reconstructed; they assume llama-index >= 0.10, the original file may differ):
```python
# NOTE: the imports below are reconstructed and assume llama-index >= 0.10.
import os
from os.path import exists, join

from llama_index.core import (
    PromptTemplate,
    Settings,
    SimpleDirectoryReader,
    StorageContext,
    VectorStoreIndex,
    get_response_synthesizer,
    load_index_from_storage,
)
from llama_index.core.query_engine import RetrieverQueryEngine
from llama_index.core.retrievers import VectorIndexRetriever
from llama_index.embeddings.ollama import OllamaEmbedding
from llama_index.llms.ollama import Ollama

prompt_template = """[INST]
You are a helpful assistant, you will use the provided context to
answer user questions. Read the given context before answering questions
and think step by step. If you can not answer a user question based on
the provided context, inform the user. Do not use any other information
for answering user. Provide a detailed answer to the question.
Context: {context_str}
User: {query_str}
[/INST]
"""

base_url = os.getenv("OLLAMA_BASE_URL", "http://localhost:11434")
model = os.getenv("LLM_MODEL", "mistral")
temperature = float(os.getenv("LLM_TEMPERATURE", 0.1))
prompt_template = os.getenv("LLM_PROMPT_TEMPLATE", prompt_template)
persist_dir = os.getenv("INDEXES_PERSIST_DIR", "./storage")
embed_model_name = os.getenv("OLLAMA_EMBED_MODEL_NAME", "mxbai-embed-large")

ollama_embedding = OllamaEmbedding(
    model_name=embed_model_name,
    base_url=base_url,
    ollama_additional_kwargs={"mirostat": 0},
)

# Set up the llm
Settings.llm = Ollama(base_url=base_url, model=model, temperature=temperature)
prompt = PromptTemplate(template=prompt_template)

# Set up service context with our local llm and embedding
Settings.embed_model = ollama_embedding

# Set up a global index so we do not need to read from memory all the time
index: VectorStoreIndex = None


# Indexer helper
def nodes_to_vector(nodes):
    global index
    if not exists(join(persist_dir, "index_store.json")):
        index = VectorStoreIndex.from_documents(nodes)
        index.storage_context.persist(persist_dir=persist_dir)
        return "Vector created"
    # Now we are sure that we have a dir and files
    storage_context = StorageContext.from_defaults(persist_dir=persist_dir)
    index = load_index_from_storage(storage_context)
    refreshed = index.refresh_ref_docs(
        nodes,
        # update_kwargs={"delete_kwargs": {"delete_from_docstore": True}}
    )
    if any(refreshed):
        index.storage_context.persist()
    return sum(refreshed)


def ensure_index_exists():
    global index
    if not index:
        nodes_to_vector([])


# File reader helper:
def directory_reader_setup(is_dir, path):
    required_ext = [".md"]
    if is_dir:
        return SimpleDirectoryReader(
            input_dir=path,
            recursive=True,
            filename_as_id=True,
            required_exts=required_ext,
        )
    else:
        return SimpleDirectoryReader(
            input_files=[path], filename_as_id=True, required_exts=required_ext
        )


def index_dir(dir_path):
    if any(fname.endswith(".md") for fname in os.listdir(dir_path)):
        parser = directory_reader_setup(True, dir_path)
        return nodes_to_vector(parser.load_data())
    else:
        # No markdown files in the directory
        return -1


def index_file(file_path):
    parser = directory_reader_setup(False, file_path)
    return nodes_to_vector(parser.load_data())


def delete_index(file_path):
    global index
    ensure_index_exists()
    # Get from the path all the parts (they look like: filepath_part_0)
    d = {
        k: v
        for k, v in index.ref_doc_info.items()
        if v.metadata["file_path"] == file_path
    }
    for key in d:
        index.delete_ref_doc(key, delete_from_docstore=True)
    index.storage_context.persist()


def query(query):
    global index
    ensure_index_exists()
    retriever = VectorIndexRetriever(
        index=index,
        similarity_top_k=5,
    )
    response_synthesizer = get_response_synthesizer(
        text_qa_template=prompt,
    )
    query_engine = RetrieverQueryEngine(
        retriever=retriever,
        response_synthesizer=response_synthesizer,
        # node_postprocessors=[SimilarityPostprocessor(similarity_cutoff=0.7)],
    )
    response = query_engine.query(query)
    return response.response
```
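For orientation, a minimal sketch of how these helpers would be called (the vault path and question are hypothetical; the module path comes from the tracebacks above):

```python
from ollama_obsidian_indexer import llm

# Index a vault directory, then query it.
llm.index_dir("/home/user/Documents/vault")  # hypothetical path
print(llm.query("What did I write about Nix flakes?"))
```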
I am running an Obsidian plugin backed by a server written in Python. I build it using `nix build` and then run it in a shell window. The Flask server runs, but when I open Obsidian I get the error above. How can I see where this server looks for the missing file?
The error message indicates that the `os.listdir(dir_path)` call in the `index_dir` function is failing because `dir_path` is an empty string. This suggests that the function is being called with an empty string as an argument.
To debug this issue, you can add some logging or print statements to see where the empty string is coming from. Here are a few suggestions:
1. Add a print statement before the `os.listdir(dir_path)` call to see what the value of `dir_path` is:
```python
def index_dir(dir_path):
    print(f"index_dir called with dir_path={dir_path}")
    if any(fname.endswith(".md") for fname in os.listdir(dir_path)):
        ...
```
This will print the value of `dir_path` to the console when the function is called.
```shell
File "/nix/store/5vj6780djxgr0r64kysp7jfvy8fcvqmw-python3.11-ollama-obsidian-indexer-0.1.0/lib/python3.11/site-packages/ollama_obsidian_indexer/llm.py", line 105, in index_dir
if any(fname.endswith(".md") for fname in os.listdir(dir_path)):
^^^^^^^^^^^^^^^^^^^^^^
FileNotFoundError: [Errno 2] No such file or directory: ''
```
2. Check the call stack to see where the `index_dir` function is being called from. You can use the `inspect` module to get the current frame and print the call stack:
```python
import inspect

def index_dir(dir_path):
    print(f"index_dir called with dir_path={dir_path}")
    caller = inspect.stack()[1]  # inspect.getframe does not exist; stack()[1] is the caller
    print(f"Called from {caller.filename}:{caller.lineno} in {caller.function}")
    ...
```
This will print where the `index_dir` function was called from, including the file and line number.
3. Check the value of `dir_path` in the calling code. If the `index_dir` function is being called from another function, check the value of `dir_path` in that function.
By adding these logging statements, you should be able to see where the empty string is coming from and why the `os.listdir(dir_path)` call is failing.
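If you prefer the `logging` module over bare prints (a sketch; the indexer itself does not currently configure logging):

```python
import logging

logging.basicConfig(level=logging.DEBUG)
logger = logging.getLogger(__name__)

def index_dir(dir_path):
    # %r renders an empty string visibly as '' in the log line.
    logger.debug("index_dir called with dir_path=%r", dir_path)
    # ... original indexing logic continues here ...
```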
Additionally, you can add error handling to the `index_dir` function for the case where `dir_path` is an empty string. For example:
```python
def index_dir(dir_path):
    if not dir_path:
        print("Error: dir_path is empty")
        return -1
    ...
```
This will print an error message and return -1 if `dir_path` is an empty string.
```shell
127.0.0.1 - - [02/Aug/2024 16:58:37] "GET /indexing HTTP/1.1" 200 -
Error: dir_path is empty
```
The error message "Error: dir_path is empty" indicates that the `index_dir` function is being called with an empty string as an argument.
Since you're running a Flask server, it's likely that the `index_dir` function is being called from a route handler. You can check the route handler that handles the `/indexing` endpoint to see where the `dir_path` argument is coming from.
Here are some possible reasons why `dir_path` might be empty:
1. **Missing or incorrect configuration**: Check your configuration files (e.g., `pyproject.toml`, `settings.py`, etc.) to ensure that the directory path is correctly set.
2. **Invalid request data**: Verify that the request data sent to the `/indexing` endpoint contains the correct directory path. You can check the request data using `request.args` or `request.json` depending on the request method and data format.
3. **Incorrect function call**: Review the code that calls the `index_dir` function to ensure that it's passing the correct directory path as an argument.
To debug this issue, you can add more logging statements or use a debugger to inspect the values of variables and function calls.
Here are some suggestions:
1. Add a logging statement to print the request data:
```python
from flask import request

@app.route('/indexing', methods=['GET'])
def indexing():
    print(request.args)  # or request.json, depending on the request method
    dir_path = ...  # get the directory path from the request data
    index_dir(dir_path)
    ...
```
2. Use a debugger to inspect the values of variables and function calls. You can use a Python debugger like `pdb` or an IDE's built-in debugger, as sketched below.
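For instance, with `pdb` (a sketch; drop it into the indexer's `index_dir` temporarily):

```python
def index_dir(dir_path):
    import pdb; pdb.set_trace()  # pauses here; inspect dir_path, then 'w' prints the call stack
    # ... original body of index_dir ...
```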
By investigating the issue, you should be able to identify why `dir_path` is empty and fix the problem.
To set the `base_path` to `~/Documents/alex`, you can use the `os` module to expand the `~` symbol to the user's home directory.
Here's an example:
```python
import os
base_path = os.path.expanduser("~/Documents/alex")
```
This will set `base_path` to the absolute path of the `~/Documents/alex` directory.
Alternatively, you can use the `pathlib` module, which provides a more modern and Pythonic way of working with paths:
```python
import pathlib
base_path = pathlib.Path("~/Documents/alex").expanduser()
```
Both of these approaches will ensure that `base_path` is set to the correct absolute path, regardless of the user's home directory.
Note that if you want to set the `NOTES_BASE_PATH` environment variable to `~/Documents/alex`, set it in your shell or in the `.env` file the indexer README calls for; Poetry itself does not read arbitrary environment variables from `pyproject.toml`.
For example, in the `.env` file in the indexer's project root:
```shell
NOTES_BASE_PATH=~/Documents/alex
```
Assuming the indexer loads the `.env` at startup (as its README implies), this will set `NOTES_BASE_PATH` for the server.
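Either way, the code reading the variable still has to expand the leading `~` itself, for example:

```python
import os

# Read NOTES_BASE_PATH (with a fallback) and expand "~" to the user's home directory.
base_path = os.path.expanduser(os.getenv("NOTES_BASE_PATH", "~/Documents/alex"))
```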
```shell
File "/nix/store/hy11zy9kcbnzzamvi2xyb0zhg41kc5nw-python3.11-ollama-obsidian-indexer-0.1.0/lib/python3.11/site-packages/ollama_obsidian_indexer/llm.py", line 104, in index_dir
if any(fname.endswith(".md") for fname in os.listdir(dir_path)):
^^^^^^^^^^^^^^^^^^^^^^
FileNotFoundError: [Errno 2] No such file or directory: ''
```
Test the following plugins:

- https://github.com/pfrankov/obsidian-local-gpt

Prompt: "Create summary of my latest notes"

Output:

> # Summary of Latest Notes
>
> ### **Project Overview**
>
> Our company has been exploring ways to increase customer engagement through innovative marketing strategies. After conducting thorough research and analysis, we have identified the following key points:
>
> #### Key Findings
>
> * The use of social media platforms has increased significantly over the past two years, with 80% of our target audience having a presence on at least one platform.
> * According to a study by Hootsuite, 71% of online adults aged 18-29 use Instagram, while 56% use Facebook.
> * A survey conducted by Pew Research Center found that 74% of online adults aged 18-64 have made a purchase online in the past 12 months.
>
> #### Marketing Strategy
>
> Based on these findings, we propose implementing the following marketing strategy:
>
> * **Content Creation**: Develop high-quality, engaging content (blog posts, videos, social media posts) that resonates with our target audience and encourages them to share their experiences.
> * **Influencer Partnerships**: Collaborate with social media influencers who have a genuine interest in our products or services. This will not only increase brand visibility but also build trust among our target audience.
> * **Email Marketing**: Establish an email marketing campaign that provides personalized content and promotions to subscribers, increasing the likelihood of conversions.
>
> #### Implementation Roadmap
>
> To ensure successful implementation, we recommend the following:
>
> * **Week 1-2**: Conduct a thorough analysis of our current social media presence and create a comprehensive strategy for improvement.
> * **Week 3-4**: Develop engaging content (blog posts, videos, social media posts) that resonates with our target audience.
> * **Week 5-6**: Collaborate with social media influencers who have a genuine interest in our products or services.
>
> ### **Budget Allocation**
>
> To allocate resources effectively, we propose the following budget distribution:
>
> * Content creation: 30%
> * Influencer partnerships: 20%
> * Email marketing: 20%
> * Miscellaneous (travel, equipment): 30%
>
> By implementing this marketing strategy and allocating our resources efficiently, we are confident that we can significantly increase customer engagement and drive sales.
As a Developer Advocate I want to have access to a personal AI. A personal AI is an LLM interface that lets the model learn from its interactions with the user: it learns from the user's private data and helps them process daily notes, reply to emails, write content, remember ideas, and so on. Obsidian is a note-taking app that uses markdown and has a diverse plugin ecosystem. Ollama is software that lets you get up and running with Llama 3.1, Mistral, Gemma 2, and other large language models; it provides an interface for managing LLMs much as Docker manages images. In this setup, ollama provides two models: llama3.1:8b for chat and mxbai-embed-large for embeddings.
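As a quick sanity check of that model pairing (a sketch using the `ollama` Python client, which is separate from the indexer; it assumes the ollama daemon is running locally):

```python
import ollama  # pip package "ollama"

# Chat model for generation.
reply = ollama.chat(
    model="llama3.1:8b",
    messages=[{"role": "user", "content": "Summarize today's notes in one line."}],
)
print(reply["message"]["content"])

# Embedding model for indexing.
emb = ollama.embeddings(model="mxbai-embed-large", prompt="a note about Nix flakes")
print(len(emb["embedding"]))
```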