Xatta-Trone / medium-parser-extension

Read medium.com and Medium-based articles using the Google web cache.
MIT License

Not able to view the code #14

Closed: yhjJean closed this issue 11 months ago

yhjJean commented 11 months ago

I am able to unlock the article and view the images and text, but not the code (if any).

Xatta-Trone commented 11 months ago

Please give me the link to the original article here.

Xatta-Trone commented 11 months ago

@yhjJean

yhjJean commented 11 months ago

https://betterprogramming.pub/how-to-make-a-cross-platform-image-classifying-app-with-flutter-and-fastai-2a6af6701535

Xatta-Trone commented 11 months ago

Google's web cache does not support iframes. That is why I added an archive proxy as secondary support, which works in most cases but not in yours. So, I will copy and paste all the code for you here.

Xatta-Trone commented 11 months ago

Building the Web App

While building the web app, we will mostly be focusing on the server.py file in the repository that you have already forked; it is the heart of the application.

The lines that concern us are the following:

export_file_url = 'DOWNLOAD_URL'  # download URL for the .pkl file
export_file_name = 'export.pkl'
classes = ['sweep', 'coverdrive', 'straightdrive', 'helicopter', 'scoop', 'pull']

Where it says DOWNLOAD_URL, simply paste the download URL of your export.pkl file and replace the classes with the labels of your own model. The app is now deployment-ready!
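For instance, with hypothetical values (the Google Drive file ID below is a placeholder, and the labels are the ones used for this tutorial's model), the edited lines might look like this:

# Hypothetical example values; FILE_ID is a placeholder for your own file's ID.
export_file_url = 'https://drive.google.com/uc?export=download&id=FILE_ID'
export_file_name = 'export.pkl'
classes = ['sweep', 'coverdrive', 'straightdrive', 'helicopter', 'scoop', 'pull']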

However, there are still some cosmetic things that need to be changed. In the view folder, edit the index.html file to represent your model and what your model does, which is probably not the classification of teddy bears!

What’s All the Other Stuff in the Code?

Now that we have made sure that the app will function, before telling you how to deploy it, I’ll walk you through what the code actually means. We’ve already covered the export_file and classes code, and the imports are fairly self-explanatory. We will begin by examining the following block of code:

# Initialize the app, allow cross-origin requests, and serve the static assets.
app = Starlette()
app.add_middleware(CORSMiddleware, allow_origins=['*'], allow_headers=['X-Requested-With', 'Content-Type'])
app.mount('/static', StaticFiles(directory='app/static'))

# Download the exported model to `dest`, skipping the download if it already exists.
async def download_file(url, dest):
    if dest.exists(): return
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            data = await response.read()
            with open(dest, 'wb') as f:
                f.write(data)

# Fetch the exported model and load it into a fastai learner.
async def setup_learner():
    await download_file(export_file_url, path / export_file_name)
    try:
        learn = load_learner(path, export_file_name)
        return learn
    except RuntimeError as e:
        if len(e.args) > 0 and 'CPU-only machine' in e.args[0]:
            print(e)
            message = "\n\nThis model was trained with an old version of fastai and will not work in a CPU environment.\n\nPlease update the fastai library in your training environment and export your model again.\n\nSee instructions for 'Returning to work' at https://course.fast.ai."
            raise RuntimeError(message)
        else:
            raise

Although this seems like quite a lot to digest, what it does is fairly simple. First, it initializes the app, then it mounts the static folder to get all the CSS and JavaScript for the web app, and finally, it defines two functions, each of which is very important to the web application as a whole. The first, download_file, downloads your machine learning model, while the second, setup_learner, loads the model you created.
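For context, the standard fastai deployment template that repositories like this one are forked from also runs setup_learner once at startup, so that the learn variable used by the routes below actually exists. A rough sketch of that wiring:

# Run setup_learner() once on the event loop so `learn` is ready
# before the server starts handling requests.
loop = asyncio.get_event_loop()
tasks = [asyncio.ensure_future(setup_learner())]
learn = loop.run_until_complete(asyncio.gather(*tasks))[0]
loop.close()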

Moving on to the next code block:


# Serve the landing page of the web app.
@app.route('/')
async def homepage(request):
    html_file = path / 'view' / 'index.html'
    return HTMLResponse(html_file.open().read())

# Accept an uploaded image, run the model on it, and return the prediction as JSON.
@app.route('/analyze', methods=['POST'])
async def analyze(request):
    img_data = await request.form()
    img_bytes = await (img_data['file'].read())
    img = open_image(BytesIO(img_bytes))
    prediction = learn.predict(img)[0]
    return JSONResponse({'result': str(prediction)})

This code block defines the two routes of the application. The first is the homepage, which returns the index.html file as an HTML response. The second route, /analyze, is where all the magic happens: it collects an image through the form present in the index.html file, applies the model to the image, and returns the prediction as a JSON response. This will make things a whole lot easier for us when we get to developing the mobile app.
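Once the service is deployed (see the next section), you can sanity-check the /analyze endpoint from any HTTP client. Here is a minimal Python sketch, assuming a placeholder service URL and a local test image:

import requests

# Placeholder URL; substitute the address of your own deployed service.
url = 'https://your-app.onrender.com/analyze'

# POST a local image as the 'file' form field, matching what index.html sends.
with open('test.jpg', 'rb') as f:
    response = requests.post(url, files={'file': f})

print(response.json())  # e.g. {'result': 'sweep'}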

Deploying the Web App

Now, we can’t use the web app we created as an API unless we deploy it, so let’s get to it!

1. [Go to render.com](https://dashboard.render.com/) and sign in with your GitHub account.
2. Create a new web service and use the repository that we have been using throughout this tutorial.
3. Choose Docker as the environment and name your service.
4. Click Save Web Service.

Yay! We’re done deploying it — pretty easy, right? Now while you wait for Render to get your web app up and running, let’s get to work on the mobile app. Of course, if you don’t want to build the mobile app, you can stop here and come back if you ever want to go mobile!

Getting Ready to Build the Flutter App

What you’ll need:

Android Studio
Flutter SDK
AVD Manager (Installed alongside Android Studio)

If you don’t already have Android Studio prepped for Flutter development, follow this tutorial: Set up an editor

Alright, it is time to begin!

Setting Up Dependencies

Open up Android Studio and start a new Flutter project.

Now open up the pubspec.yaml file and edit the dependencies section to look like this:

dependencies:
  flutter:
    sdk: flutter
  image_picker:
  http:

Now open up the terminal and run the following command:

flutter pub get

This will make sure that all the necessary packages are installed and that the code won’t have any import errors.

Making the App

To create the app, we need to edit the main.dart file. Replace the current code in main.dart with the article's main.dart code; it was embedded as a gist, which does not carry over here, so see the gist link in the comment below.

Change the base URL to the URL of the Render service you created earlier and customize the rest of the app as you wish.

Run the code on your Android Virtual Device, and it should work as expected!

Here’s how the app looks on my screen; obviously, you might need to edit the text to represent your model better: [screenshot of the app from the article]

We’re done! You just built a cross-platform mobile app that can classify images!

If you liked this article, consider signing up for my newsletter to receive great content every Sunday: https://mailchi.mp/35c069691d2c/newsletter-signup.

Xatta-Trone commented 11 months ago

Link to all the files: https://gist.github.com/siddhantdubey

@yhjJean

yhjJean commented 11 months ago

Thank you so much!