sa-paul / Academic_Question_Answering-AQA-_KDD-2024

A model for retrieving the papers most relevant to answering professional questions from various domains. Given a question and a pool of candidate papers, the objective is to retrieve the papers that best answer it.
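
As a rough sketch of this retrieval setup (not the official baseline), the snippet below ranks candidate papers by cosine similarity between mean-pooled SciBERT embeddings of the question and each abstract; the encoder checkpoint and pooling strategy here are assumptions for illustration.

import torch
from transformers import AutoTokenizer, AutoModel

# SciBERT is an assumed encoder choice; any BERT-style checkpoint would work.
tok = AutoTokenizer.from_pretrained("allenai/scibert_scivocab_uncased")
enc = AutoModel.from_pretrained("allenai/scibert_scivocab_uncased")

def embed(texts):
    # Mean-pool the last hidden states into one vector per text.
    batch = tok(texts, padding=True, truncation=True, max_length=512, return_tensors="pt")
    with torch.no_grad():
        hidden = enc(**batch).last_hidden_state
    mask = batch["attention_mask"].unsqueeze(-1)
    return (hidden * mask).sum(1) / mask.sum(1)

def rank_papers(question, abstracts):
    # Return candidate indices sorted by similarity to the question.
    sims = torch.nn.functional.cosine_similarity(embed([question]), embed(abstracts))
    return sims.argsort(descending=True).tolist()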

Updates & Resources #1

sa-paul opened this issue 3 months ago

sa-paul commented 3 months ago

Post Updates and Resources:

Official Website for KDD 2024: KDD'24 and Guidelines-ACA

LLM Roadmap:

  1. 5 days LLM blog
  2. LangChain for LLM AppDev
  3. LangChain chat with your data
  4. Sci-BERT
  5. Transformer-Neural-Network
  6. Transformer Architecture

General Workflow:

(Workflow diagram attached as image.)

Swagata2003 commented 3 months ago

Implementation code:

from transformers import BertTokenizer, BertForQuestionAnswering
import torch

# Load the SciBERT tokenizer and model.
# Note: this checkpoint has no trained QA head, so the span-prediction
# layer is randomly initialized; answers will be unreliable until the
# model is fine-tuned on a QA dataset such as SQuAD.
tokenizer = BertTokenizer.from_pretrained('allenai/scibert_scivocab_uncased', do_lower_case=True)
model = BertForQuestionAnswering.from_pretrained('allenai/scibert_scivocab_uncased')

def answer_question(question, passage):
    # Tokenize the question/abstract pair (truncated to BERT's 512-token limit)
    inputs = tokenizer.encode_plus(question, passage["info"], return_tensors="pt", max_length=512, truncation=True)

    # Run the model in inference mode (no gradient tracking)
    with torch.no_grad():
        outputs = model(**inputs)

    # Extract start and end logits from the model outputs
    start_logits = outputs.start_logits
    end_logits = outputs.end_logits

    # Get the most likely answer span
    start_idx = torch.argmax(start_logits)
    end_idx = torch.argmax(end_logits) + 1  # add 1 for an exclusive end index

    # Guard against a degenerate span where the end precedes the start
    if end_idx <= start_idx:
        end_idx = start_idx + 1

    # Decode the answer-span tokens back into text
    answer_tokens = inputs["input_ids"][0][start_idx:end_idx]
    answer = tokenizer.decode(answer_tokens, skip_special_tokens=True)

    return answer

# Example usage
question = "What do we know about supersymmetric domain walls?"
passage = {
    "paper": "hep-th/9201001",
    "from": "zuber@poseidon.saclay.cea.fr",
    "author": "C. Itzykson",
    "date": "Tue Dec 31 23:54:17 MET 1991 +0100",
    "title": "Combinatorics of the Modular Group II: the Kontsevich integrals",
    "authors": ["C. Itzykson", "J.-B. Zuber"],
    "comments": "46 pages",
    "journal_ref": "Int.J.Mod.Phys. A7 (1992) 5661-5705",
    "info": "We study supersymmetric domain walls in N=1 supergravity theories, including those with modular-invariant superpotentials arising in superstringcompactifications. Such domain walls are shown to saturate the Bogomol nyi bound of wall energy per unit area. We find \sl static \rm and \sl reflection asymmetric \rm domain wall solutions of the self-duality equations for the metric and the matter fields. Our result establishes a new class of domain walls beyond those previously classified. As a corollary, we define a precise notion of vacuum degeneracy in the supergravity theories. In addition, we found examples of global supersymmetric domain walls that do not have an analog when gravity is turned on. This result establishes that in the case of extended topological defects gravity plays a crucial, nontrivial role."
}
print("Answer:\n")
print(answer_question(question, passage))
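
Since the AQA task asks for relevant papers rather than answer spans, one hypothetical way to reuse this model for ranking is to score each candidate by the confidence of its best span; `candidates` below is a placeholder for the candidate-paper pool.

def relevance_score(question, passage):
    # Score a candidate by the strength of its best start/end logits.
    inputs = tokenizer.encode_plus(question, passage["info"], return_tensors="pt",
                                   max_length=512, truncation=True)
    with torch.no_grad():
        outputs = model(**inputs)
    return (outputs.start_logits.max() + outputs.end_logits.max()).item()

# `candidates` is a hypothetical list of passage dicts like the one above.
ranked = sorted(candidates, key=lambda p: relevance_score(question, p), reverse=True)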

sa-paul commented 3 months ago

Roadmap to ACA:

acaLLM-IKDD24

sa-paul commented 3 months ago

Baseline model for the above project:

OAG-BERT
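
A minimal loading sketch, assuming the CogDL-based interface described in the OAG-BERT repository; the import path and loader name below are taken on trust and should be verified against that repo.

# Assumed API from the OAG-BERT README; verify before use.
from cogdl.oag import oagbert

tokenizer, model = oagbert("oagbert-v2")  # downloads the pretrained weights
model.eval()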

sa-paul commented 2 months ago

Google Colab Link:

ACA_with_given_dataset

Outcome:

  1. Runtime crashed due to insufficient RAM (Colab's default 12 GB) => hardware-constrained.
  2. The code itself runs without errors.
  3. Hit Colab's RAM-crash error; related threads (Fine-tuning RoBERTa in Colab, Minimum GPU size for training Roberta, etc.) suggest around 64 GB of RAM is needed; see the memory-saving sketch below.
  4. Extending RAM for free: FAILED.
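
Before moving to paid hardware, it may be worth shrinking the training footprint with half precision, gradient accumulation, and gradient checkpointing. A minimal sketch with the Hugging Face Trainer, where the checkpoint name and `train_dataset` are placeholders:

from transformers import AutoModelForSequenceClassification, Trainer, TrainingArguments

# Placeholder checkpoint; substitute the model actually being fine-tuned.
model = AutoModelForSequenceClassification.from_pretrained("roberta-base")
model.gradient_checkpointing_enable()  # recompute activations to save memory

args = TrainingArguments(
    output_dir="out",
    per_device_train_batch_size=2,   # small batches to fit in ~12 GB
    gradient_accumulation_steps=16,  # effective batch size of 32
    fp16=True,                       # half precision (requires a GPU)
    num_train_epochs=1,
)

# `train_dataset` is a placeholder for the tokenized training set.
trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()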