February Product Flow #1
-
Hey team -> got some discussion around the data types & how we'll be presenting them across the platform:
https://elianna.notion.site/Dataset-structure-Discussion-42d23cf1ced447138c6ffef1c01df286

Based on my current understanding and very limited scope so far, I'm storing things in a database simply as fields. Most of the discussion/examples refer to planets simply because that's what we've built up so far. There are a few limitations to this:

Ultimately, I think we can build out the initial system with these limitations for now, as it will only be us using & building on the platform. That means we can assume there won't be any malicious intent, and there won't be so much data that it becomes impossible to fix mistakes. Once we have a better understanding of how we'll migrate this over to Nodes, we'll hopefully get closer to solving these issues/limitations.

I think the way the data is stored will mostly remain the same -> either datasets in the form of tables, or preferably modules like Lightkurve that can then be imported (maybe we could look for a tool that could automate this process -> storing the data directly on IPFS anyway). Long-term, we'll only have user data & unpublished content living in the relational off-chain DB, so I think we'll be fine with this model. However, some research on the frontend & API structure will be needed as well, to ensure that regardless of which model/dataset is being used, anything can be queried by the API and the UX/UI design remains consistent regardless of how many fields, or which fields, are being displayed.

In short, data should be presented in tabular format where possible -> Field Name, Field Content. For every collection/dataset/article we work with, we should write a standard for what each table should consist of so that I can set that up on the backend. One day soon I hope to provide a direct link to this comment - for now, if you spin up …
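As a rough illustration of the "Field Name, Field Content" idea -> a minimal sketch, assuming we normalise every dataset row into ordered (field, value) pairs so the frontend can render any collection the same way. All names here (`DatasetRow`, the planet fields) are hypothetical, not our actual schema:

```python
from dataclasses import dataclass, field
from typing import Any

@dataclass
class DatasetRow:
    """A generic row: any dataset is reduced to ordered (field, value) pairs."""
    fields: dict[str, Any] = field(default_factory=dict)

    def as_table(self) -> list[tuple[str, str]]:
        # The frontend renders this list identically regardless of which
        # fields a given collection (planets, stars, ...) happens to define.
        return [(name, str(value)) for name, value in self.fields.items()]

# Hypothetical planet record - the real per-collection field standard is still to be written.
planet = DatasetRow({"name": "Kepler-22b", "radius_earth": 2.4, "orbital_period_days": 289.9})
for field_name, field_content in planet.as_table():
    print(f"{field_name}: {field_content}")
```

Whatever standard we write per collection would just constrain which keys are allowed in `fields`; the rendering path stays identical.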
-
I've been working on the action items over the last few days and I've come up with the next steps. Here's the gameplay info (comments and code snippets are available on the Notion link above):

Gameplay: entry point & early user flow
1. Claim their first anomaly
2. Explore it (datasets)
   - The generated image should be moved here as well
3. Write an article about it
   - Tag the anomaly id; this should then show up in "article refs"
4. Once an article has been written (which is basically mimicking the classification process), allow users an opportunity to mint a token, and then display the specific staking action (currently the …)
5. Return the id of the new token and add that to the Supabase anomaly row. Note that, as described in the tag pages (here when online), anomalies are all created as 1155s (lazy minted); once classified, that 1155 is "burnt" and turned into a 721
6. Update the user's reputation, thus allowing them to mint (claim) another anomaly
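The claim -> classify -> mint flow above can be sketched as a simple state machine. This is a mock only -> the real mint/burn is on-chain via the lazy-minted 1155 -> 721 path, and all names here (`Anomaly`, `classify_and_mint`, the ids) are illustrative:

```python
from dataclasses import dataclass, field

@dataclass
class Anomaly:
    anomaly_id: int
    token_standard: str = "ERC-1155"   # lazy-minted on creation/claim
    article_ids: list = field(default_factory=list)

@dataclass
class User:
    reputation: int = 0

def write_article(anomaly: Anomaly, article_id: int) -> None:
    # Writing an article mimics the classification process (step 3/4 above).
    anomaly.article_ids.append(article_id)

def classify_and_mint(user: User, anomaly: Anomaly) -> str:
    # Once classified, the lazy-minted 1155 is "burnt" and re-issued as a 721,
    # and the user's reputation goes up, unlocking the next claim (step 6).
    if not anomaly.article_ids:
        raise ValueError("anomaly must have an article (classification) first")
    anomaly.token_standard = "ERC-721"
    user.reputation += 1
    return anomaly.token_standard

user, anomaly = User(), Anomaly(anomaly_id=1)
write_article(anomaly, article_id=42)
print(classify_and_mint(user, anomaly))  # ERC-721
print(user.reputation)                   # 1
```

The guard in `classify_and_mint` encodes the ordering constraint: minting is only offered after classification.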
-
February Flow - Planets (Notion page)

This discussion is mapping out the infrastructure planning, talking points & questions/updates for the Planet & Metadata infrastructure (DeSci-md/automating-metadata#1).

Referencing: Signal-K/sytizen#16, Signal-K/sytizen#18, Signal-K/Silfur#31, Signal-K/client#19
As of the 11th of February, we have the following fragmented components in the `client` ecosystem:

- … (`three.js`)
- … (`.fits` format) from a graph generated by a Lightcurve
- `psycopg2` -> can interact with the Supabase client

Currently the Lens Protocol view is using Chakra UI, while the social graph is using `tailwindcss` (plus custom/plain CSS files). I want to update the Lens Feed to use the `<Card />` component defined in the `social-graph` `./styles` dir. Today, I re-integrated the Lens Feed View & GraphQL headers into the social graph in `wb3-12-add-...` on `signal-k/sytizen`. Hopefully we can use one basic set of CSS rules long-term; each fragment/section (aka component, sort of -> referring to large sets of components here, really) might initially be designed using something like Chakra or Tailwind, but long-term we need to migrate to one standard to keep the `package.json` as small as possible - might be something else for DAVID to look at.

Still to be added:
- The `traders` branch on `signal-k/sytizen` (and the immediately derived branch[es]) for some initial UI work on what I imagine a draft UI for this would look like
- … the `profiles` table, and this can then be added into a "dump" transaction on a breadboard module on top of a Lens publication. For example, if a user "mints" a planet, they'll just be getting a planet sent to their inventory (the mint button will interact with a lazy mint, but data will be sent to Postgres for the NFT ID and matched with the `session?.user?.id` tag, so that the NFT id belongs to the user in the off-chain instance but to a default (e.g. `u/parselay.lens` or `u/g1zmotronn.lens`) ETH wallet on Goerli/Mumbai/Polygon). That planet can then be interacted with off-chain - e.g. adding a building. This off-chain interaction ["transaction"] is then stored in Supabase, and can then be added to an off-chain post on the social graph. All relevant (for a single anomaly/object) social-graph posts will then be sent to a single Lens publication that tags those arrays but is published by the default ETH bucket/address (pre-MVP)

Re-integration of on-chain interactivity will occur once the basic UI for the gameplay has been initialised. Currently there is debate internally about whether we use the generated GraphQL hooks or just the default Lens contract ABI (from the `@lens-protocol` node.js package). We also need to map out the full smart contract & ERD infrastructure later in March - DAVID.
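A minimal sketch of the off-chain bookkeeping described above, assuming a Supabase-like table is just a dict here (real code would go through the Supabase client; the wallet name is the placeholder default from the text, and `lazy_mint`/`interact` are hypothetical helpers):

```python
# Off-chain ledger: the NFT id maps to the owning session user, while a shared
# default wallet actually holds the token on Goerli/Mumbai/Polygon (pre-MVP).
DEFAULT_WALLET = "u/parselay.lens"   # placeholder default ETH bucket

off_chain_owners: dict[int, str] = {}   # nft_id -> session user id
interactions: list[dict] = []           # off-chain "transactions"

def lazy_mint(nft_id: int, session_user_id: str) -> dict:
    # The mint button triggers a lazy mint; off-chain we record who really owns it.
    off_chain_owners[nft_id] = session_user_id
    return {"nft_id": nft_id,
            "on_chain_owner": DEFAULT_WALLET,
            "off_chain_owner": session_user_id}

def interact(nft_id: int, action: str) -> None:
    # e.g. adding a building; stored in the off-chain DB, later posted to the
    # social graph and rolled up into a single Lens publication per anomaly.
    interactions.append({"nft_id": nft_id, "action": action,
                         "owner": off_chain_owners[nft_id]})

record = lazy_mint(1, session_user_id="user-123")
interact(1, "add_building")
print(record["off_chain_owner"])   # user-123
print(len(interactions))           # 1
```

The point of the split is that ownership semantics live off-chain (keyed by `session?.user?.id`) until the on-chain re-integration happens, at which point the default wallet's holdings can be re-assigned.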