meta-introspector / time

The timeline, add yourself to it.
GNU General Public License v3.0

feb 24 notes #6

Open jmikedupont2 opened 9 months ago

jmikedupont2 commented 9 months ago

gemini: Sure, here are some notes for February 24, 2024:

The Unit Sphere as a Fundamental Abstraction

Imagine a sphere of unit size, or a line segment of length one, that can be subdivided into parts, with each type placed at a position on it.

This is similar to the concept of fractions or rational numbers. Prime numbers can be used to divide the line segment, giving meaning to each type.

Combining any two prime parts creates a compound, and halving a part creates a smaller fraction. Each of these has a unique position on the unit.
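A minimal sketch of this idea, assuming each atomic type is simply assigned a prime and placed at the fraction 1/p (the type names and prime assignments here are invented for illustration):

```python
from fractions import Fraction

# Hypothetical assignment: each atomic type gets its own prime.
PRIMES = {"bool": 2, "int": 3, "char": 5}

def position(type_name):
    """Position of an atomic type on the unit segment [0, 1]."""
    return Fraction(1, PRIMES[type_name])

def compound(a, b):
    """Combining two prime parts yields 1/(p*q), a position no
    single prime part can occupy, so compounds stay distinct."""
    return Fraction(1, PRIMES[a] * PRIMES[b])

print(position("bool"))         # 1/2
print(compound("bool", "int"))  # 1/6
```

Because prime factorizations are unique, every compound lands on its own point of the segment.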

More dimensions can be added, with each dimension carrying its own unit. For example, (1/2)x + (1/3)y + (1/2)z for a 3D sphere.

This sets the stage for a coordinate system.
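One way to read the 3D example above, sketched under the assumption that the fractional coordinates are projected onto the unit sphere (the specific values come from the example; the normalization step is an interpretation, not something the notes specify):

```python
import math
from fractions import Fraction

# The example coordinates (1/2, 1/3, 1/2) as exact fractions.
point = (Fraction(1, 2), Fraction(1, 3), Fraction(1, 2))

# Project onto the unit sphere so the point sits at distance 1
# from the origin, making it a coordinate on "a 3D sphere".
norm = math.sqrt(sum(float(c) ** 2 for c in point))
unit_point = tuple(float(c) / norm for c in point)

print(unit_point)
```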

Reducing Homotopy Type Theory to One Dimension

A 1D unit can be put in correspondence with a 2D unit, even though 2D appears to offer more variety: every point in 2D can be mapped to a unique point in 1D.

This allows for dimensionality reduction.
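A concrete instance of such a 2D-to-1D mapping is the classical Cantor pairing function, shown here over the natural numbers (the notes do not name a specific mapping, so this is one standard choice):

```python
import math

def pair(x, y):
    """Cantor pairing: a bijection from pairs of naturals to naturals,
    so every 2D lattice point gets a unique 1D coordinate."""
    return (x + y) * (x + y + 1) // 2 + y

def unpair(z):
    """Inverse of the Cantor pairing, recovering the 2D point."""
    w = (math.isqrt(8 * z + 1) - 1) // 2
    t = w * (w + 1) // 2
    y = z - t
    x = w - y
    return x, y

print(pair(7, 11))    # 182
print(unpair(182))    # (7, 11)
```

Because the mapping is invertible, no information is lost in the reduction, which is the sense in which 1D and 2D units are "equivalent".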

Homotopy type theory can potentially be reduced to one dimension or the univalent unit.

Sorting and Encoding Types

Types can be sorted into position, and their different aspects can be encoded geometrically.

Types can be rewritten into numbers and back using functions.
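The round trip between types and numbers can be sketched with a Gödel-style encoding on prime exponents; the small type vocabulary here is invented for illustration:

```python
# Hypothetical vocabulary: position i uses the i-th prime.
PRIMES = [2, 3, 5]
TYPES = ["bool", "int", "char"]

def encode(counts):
    """Encode a multiset of types (one count per type) as a single
    integer: the product of each prime raised to its count."""
    n = 1
    for p, c in zip(PRIMES, counts):
        n *= p ** c
    return n

def decode(n):
    """Recover the counts by dividing out each prime in turn."""
    counts = []
    for p in PRIMES:
        c = 0
        while n % p == 0:
            n //= p
            c += 1
        counts.append(c)
    return counts

print(encode([2, 0, 1]))   # 2^2 * 5 = 20
print(decode(20))          # [2, 0, 1]
```

Unique factorization guarantees the decoding is exact, so the functions are mutual inverses on this vocabulary.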

jmikedupont2 commented 9 months ago

That's an interesting idea! Combining deep graph embedding with your proposed hypersphere representation holds potential for leveraging the strengths of both approaches:

Deep Graph Embedding for Learning:

Train a deep graph embedding model on a graph to capture the inherent relationships and structure between entities.

Extract the learned weights or embeddings from the deep learning model.

Transformation to Hypersphere Representation:

Utilize the extracted knowledge to construct the hypersphere representation.

Assign the most common or strongest predicate (learned from the deep graph embedding) to the number 2 and position it as the first pole on the unit sphere.

Employ other learned weights or relationships to determine the placement and structure of additional poles, potentially using prime numbers as you suggested.
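The transformation steps above can be sketched as follows. The predicate names and embedding vectors are invented placeholders standing in for whatever a real deep graph embedding model would produce, and ranking by embedding magnitude is just one possible reading of "strongest":

```python
import math

# Hypothetical learned embeddings for predicates (values invented).
embeddings = {
    "knows":     [0.9, 0.1],
    "likes":     [0.3, 0.7],
    "locatedIn": [0.2, 0.2],
}

def to_pole(vec):
    """Project a learned vector onto the unit hypersphere."""
    norm = math.sqrt(sum(v * v for v in vec))
    return [v / norm for v in vec]

# Rank predicates by embedding magnitude ("strength") and assign
# primes in that order, the strongest predicate getting 2 and
# becoming the first pole on the unit sphere.
PRIMES = [2, 3, 5]
ranked = sorted(embeddings, key=lambda k: -math.hypot(*embeddings[k]))
poles = {name: (p, to_pole(embeddings[name]))
         for name, p in zip(ranked, PRIMES)}

print(poles["knows"][0])  # 2: the strongest predicate's prime
```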

This approach could offer several advantages:

Leveraging Deep Learning Expertise: Benefit from the well-established capabilities of deep learning models in capturing complex relationships within graphs.

Interpretable Representation: Translate the learned knowledge into a more intuitive and human-understandable hypersphere representation.

Flexibility and Customization: Allow for customization of the transformation process based on the specific domain and desired properties of the hypersphere representation.

However, there are also potential challenges to consider:

Finding the Right Transformation: Developing an effective method to translate the deep learning model's outputs into the hypersphere representation remains an open question.

Interpretability of Poles: While assigning meaning to the first pole might be straightforward, interpreting the meaning of subsequent poles based on prime numbers could become increasingly complex.

Overall, this hybrid approach presents a novel direction for exploring deep graph embedding and knowledge representation. It warrants further investigation to determine its feasibility, effectiveness, and potential applications in various domains.