dipdup-io / dipdup

Modular framework for creating selective indexers and featureful backends for dapps
https://dipdup.io
MIT License
98 stars 53 forks

Run one of demos' tests with PostgreSQL #1105

Open droserasprout opened 2 months ago

droserasprout commented 2 months ago

All integration tests in test_demos.py are run on SQLite. It would be nice to run any of them on PostgreSQL to check the related code. Use the dipdup.test.run_postgres_container helper to spawn a temporary container; see test_hasura.py for an example.
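A test along these lines might be shaped like the sketch below. Note that the exact signature of `run_postgres_container` is an assumption here (test_hasura.py is the authoritative example), and `test_demo_on_postgres` is a hypothetical name:

```python
import pytest


@pytest.mark.asyncio  # assuming the suite uses pytest-asyncio for async tests
async def test_demo_on_postgres() -> None:
    # Imported lazily so this module still loads where dipdup is not installed.
    from dipdup.test import run_postgres_container

    # Assumption: the helper is an async context manager yielding connection
    # details for the temporary container; check test_hasura.py for real usage.
    async with run_postgres_container() as database_config:
        # ...run one of the demo indexing scenarios against this database...
        assert database_config is not None
```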

Dprof-in-tech commented 2 months ago

I am applying to this issue via OnlyDust platform.

My background and how it can be leveraged

Hello, I am Dprof-in-tech, an experienced Full Stack Blockchain Developer, and I am excited to contribute my skills to this project during ODHACK 8. With a strong background in Next.js, TypeScript, JavaScript, React, Node.js, Python, Rust, and Cairo, I've honed my technical skills across the blockchain development landscape.

My journey with OnlyDust began at Edition 2, and I've since made 34 contributions across 11 projects. This extensive experience on the platform has allowed me to develop a keen understanding of delivering high-quality solutions under tight deadlines. I bring a unique blend of technical prowess and user-centric design to every project, whether I'm crafting immersive 3D experiences or developing innovative smart contracts.

My track record demonstrates my ability to adapt quickly and contribute effectively to diverse challenges. I'm confident in my capacity to tackle new problems and drive innovation in the blockchain space. As we begin ODHACK 8, I'm eager to leverage my hackathon experience and technical skills to push the boundaries of what's possible in blockchain development.

My OnlyDust public profile: https://app.onlydust.com/u/Dprof-in-tech

How I plan on tackling this issue

Approach for Running One of the Demos' Tests with PostgreSQL (#1105):

I will review the existing integration tests:

I will start by exploring the file test_demos.py, which contains the integration tests that are currently being run using SQLite. This will allow me to identify which test(s) would be most suitable to run on PostgreSQL. I will choose a test that has a clear relationship to database operations, ensuring we effectively test the PostgreSQL-related code.

I will review the PostgreSQL test setup: To understand how the project uses PostgreSQL in tests, I will review the existing test setup, particularly the run_postgres_container helper provided by dipdup.test, which spawns a temporary PostgreSQL container for running tests. I will also reference test_hasura.py, which already uses this helper, to understand how to configure and manage PostgreSQL containers within the test suite.

I will modify the selected demo test to run with PostgreSQL: Once I identify the demo test to be run with PostgreSQL, I will modify it to use the run_postgres_container helper function.

I will update the test setup so that the PostgreSQL database is initialized and the necessary schemas and tables are created before the test execution.

The test teardown will ensure that the temporary PostgreSQL container is stopped and cleaned up after the test run.
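Setup and teardown like this are often packaged as a pytest fixture so that cleanup is guaranteed. A minimal sketch, assuming `run_postgres_container` is an async context manager (its real signature lives in dipdup.test) and `postgres_database` is a hypothetical fixture name:

```python
import pytest


@pytest.fixture
async def postgres_database():
    """Spawn a temporary PostgreSQL container for one test, then clean it up."""
    from dipdup.test import run_postgres_container  # assumed helper from the issue

    async with run_postgres_container() as database_config:
        # Setup (schema and table creation) would go here, before yielding.
        yield database_config
        # On exit, the context manager stops and removes the container.
```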

I will handle any SQLite-specific code: If the selected test contains SQLite-specific code or assumptions, I will refactor it to work with PostgreSQL. This might involve:

  1. Modifying SQL queries or schema definitions to be compatible with PostgreSQL.
  2. Updating database connection configuration within the test.

I will ensure the refactoring does not interfere with the existing SQLite-based tests, preserving compatibility with both databases.
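As an illustration of the kind of SQLite-isms that break on PostgreSQL (the function and DDL below are examples for this comment, not dipdup code): SQLite's `AUTOINCREMENT` and `DATETIME` have no direct PostgreSQL equivalents and map to `BIGSERIAL` and `TIMESTAMP` respectively.

```python
def to_postgres_ddl(sqlite_ddl: str) -> str:
    """Rewrite a couple of SQLite-specific DDL constructs for PostgreSQL."""
    return (
        sqlite_ddl
        # SQLite rowid-style autoincrement -> PostgreSQL sequence-backed column
        .replace('INTEGER PRIMARY KEY AUTOINCREMENT', 'BIGSERIAL PRIMARY KEY')
        # SQLite's loose DATETIME affinity -> PostgreSQL TIMESTAMP
        .replace('DATETIME', 'TIMESTAMP')
    )


print(to_postgres_ddl('CREATE TABLE tx (id INTEGER PRIMARY KEY AUTOINCREMENT, ts DATETIME)'))
# → CREATE TABLE tx (id BIGSERIAL PRIMARY KEY, ts TIMESTAMP)
```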

I will run the test locally: I will first run the modified test locally using PostgreSQL by invoking the run_postgres_container helper function. I will confirm that the PostgreSQL container is spawned, the test runs successfully, and any issues are resolved. I will ensure that after the test, the PostgreSQL container is properly cleaned up, preventing resource leaks.

I will confirm compatibility with the rest of the test suite: After successfully running the test with PostgreSQL, I will rerun the full test suite to ensure that the addition of the PostgreSQL test does not introduce any regressions or interfere with existing tests.

I will submit the changes for review: Once the test is running successfully on PostgreSQL, I will submit a pull request with the changes. In the PR, I will explain how I modified the demo test to run on PostgreSQL, how I used the run_postgres_container helper, and any additional setup required for future tests using PostgreSQL.

I will also request feedback from the maintainers regarding the approach and make any necessary adjustments based on their input.

Estimated Timeline: Start Date: Immediately upon assignment. Estimated Completion Date: 2 days from the start date.

Jayse007 commented 2 months ago

I am applying to this issue via OnlyDust platform.

My background and how it can be leveraged

I would like to take this task, to be delivered between 26 September 2024 and 31 September 2024. My name is Shawon James. I am a programmer with extensive knowledge of SQL (PostgreSQL, SQLite), Python, and JavaScript. I mostly specialize in backend development with Python's Django framework. As a backend developer, I consistently work with data handling and test databases to verify that they accept the required information; this makes me well qualified to handle this task.

How I plan on tackling this issue

I would run all the necessary tests and give you feedback on where issues arise with receiving certain types of data. I would document all observed issues, organize them well, and ensure the report is unambiguous.

fabrobles92 commented 2 months ago

I am applying to this issue via OnlyDust platform.

My background and how it can be leveraged

In my day-to-day work I am a full stack engineer with experience in Django, FastAPI, and plain Python. I also have plenty of contributions to web3 projects, and that experience can help me deliver a quality solution.

How I plan on tackling this issue

I would follow these steps:

MPSxDev commented 2 months ago

I am applying to this issue via OnlyDust platform.

My background and how it can be leveraged

Hello, I am Manuel, a process engineer and web3 developer. I have participated in Starknet bootcamps and ETHGlobal, and I am an Elite winner of Speedrunstark. I have a strong capacity for solving problems, and I am a member of the DojoCoding community. I recently implemented an entire search engine in PostgreSQL, so I have relevant knowledge. I hope this issue is assigned to me; I am available to start immediately and to achieve what is required in the shortest time possible.

How I plan on tackling this issue

  1. Update test_demos.py to run integration tests on PostgreSQL in addition to SQLite.
  2. Use the dipdup.test.run_postgres_container helper to spawn a temporary PostgreSQL container for the tests.
  3. Reference test_hasura.py as an example for setting up and using the PostgreSQL container.
  4. Modify the test setup to ensure compatibility with both SQLite and PostgreSQL, focusing on the related code differences.
  5. Run the tests on PostgreSQL and ensure that any potential issues or discrepancies are identified and resolved.
Jayse007 commented 2 months ago

Thank you so much for the opportunity. I will not disappoint

Jayse007 commented 1 month ago

Please, how can I get in touch with you @droserasprout

droserasprout commented 1 month ago

@Jayse007 Please join our support group in Telegram

binayak9932 commented 3 weeks ago

Background

I'm a backend developer with strong experience in Python and SQL (PostgreSQL, SQLite), specializing in data handling and testing with Django. I have a solid track record in backend development, ensuring databases handle and store information accurately and efficiently, and I have contributed to open source.

Approach

I plan to run all the necessary tests, document any issues found, and provide clear, organized feedback to address any data-handling inconsistencies.