Jpiccoli1691 / 305FitnessCasino

Fitness Casino.

A/B Testing #56

Open Ptiefighter opened 3 days ago

Ptiefighter commented 3 days ago

Format:

A/B Test Name: Give the test a meaningful name. For example, "Signup/Sign In: 1 screen or 2 screens" could be the title of an A/B test that determines whether users prefer having account creation and login on a single screen or on two screens.

User Story Number: For instance, the "Signup/Sign In: 1 screen or 2 screens" A/B test would be a task under US1 (Account Creation).

Metrics: Your team's HEART metrics that this A/B test measures.

Hypothesis: State your hypothesis for this A/B test. What problem are we trying to solve, and what is its impact (e.g., how big is this problem for our customers)?

To formulate the hypothesis, first define the problem you want to solve. For example, suppose you are a SaaS that offers a free trial and you want to improve Adoption. That problem is too broad to form an A/B test, because you can only test one variable per A/B test for it to be effective (otherwise you won't know which variable caused the change). To narrow down the problem, find the bottleneck in the conversion funnel: where do people drop off the most? Is there key information or a call-to-action button that you expect people to read or click, but they don't?

After narrowing down the problem, make a hypothesis about what causes those bottlenecks and what you can do to improve. For example, you notice that most visitors reach your "Features" page but very few scroll past even half of it, so many features you consider important are never actually seen. One hypothesis might be that a tab or toggle-list design would make the page shorter and let visitors dig deeper into the content they are interested in by expanding it. Remember: when formulating your hypothesis, change only one variable, so that you will know it is really that variable causing the change in conversion.
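Finding the bottleneck described above usually means computing the step-to-step drop-off in the conversion funnel. A minimal Python sketch, assuming you have exported event counts per funnel step from your analytics tool (the step names and counts here are hypothetical):

```python
# Hypothetical funnel: (event name, number of users reaching it), in order.
funnel = [
    ("visit_landing", 10_000),
    ("view_features", 6_200),
    ("start_free_trial", 900),
    ("complete_signup", 700),
]

def drop_off_rates(steps):
    """Return (step name, fraction of users lost since the previous step)."""
    rates = []
    for (_, prev_count), (name, count) in zip(steps, steps[1:]):
        rates.append((name, 1 - count / prev_count))
    return rates

for name, lost in drop_off_rates(funnel):
    print(f"{name}: {lost:.0%} drop-off")
```

In this hypothetical data, the largest drop-off is between viewing features and starting the free trial, which is where an A/B test would be most worthwhile.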
Experiment: Detail the experiment setup you will use to test your hypothesis using Firebase capabilities. Describe the audience: will all users see the experiment, or will you allocate only x% of your user base to it? Lay out the details and the rationale behind this decision. Describe the tracking using Firebase Analytics: given your HEART metrics, what tracking needs to be set up?
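Firebase A/B Testing handles audience allocation for you, but the underlying idea is a stable split: each user is deterministically assigned to a variant and always sees the same one. A sketch of that idea in Python (the function name and experiment key are hypothetical, not part of any Firebase API):

```python
import hashlib

def assign_variant(user_id: str, experiment: str, percent_in_b: int = 50) -> str:
    """Deterministically assign a user to variant 'A' or 'B'.

    Hashing (experiment name + user id) gives each user a stable bucket
    in [0, 100); the same user always lands in the same variant.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100
    return "B" if bucket < percent_in_b else "A"

print(assign_variant("user-42", "signup_screens_test"))
```

Keying the hash on the experiment name as well as the user id ensures that assignments in one experiment are independent of assignments in another.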

Each team member must have a separate A/B test committed to the ABTEST.md file.

Jpiccoli1691 commented 1 day ago

A/B Test Name: "Color Scheme Impact on Engagement and Task Success"

User Story Number: Color Changes (User Interface Customization)

Metrics:

Happiness: Positive feedback on app aesthetics, Net Promoter Score (NPS).

Engagement: Average session length, page visit counts.

Task Success: Button action events, search exit rate.

Hypothesis: If we apply a lighter color scheme to the app, then users will find it more visually appealing, leading to longer session times and increased interaction with app features, since lighter colors may improve readability and perceived usability.

Problem & Impact:

Problem: Some users find the current color scheme unappealing or hard to navigate, potentially impacting their willingness to interact with the app.

Impact: The current color scheme may deter users from staying engaged, which could result in shorter sessions, lower engagement with app features, and diminished overall satisfaction.

Experiment Setup:

Audience: 50% of the total user base will see the new color scheme, and the other 50% will continue with the existing scheme.

Tracking via Firebase Analytics:

Average Session Length (Engagement): Measure changes in time spent per session between the two groups.

Button Action Events (Task Success): Track button interaction rates for both groups to gauge user comfort with the UI.

Page Visit Counts (Engagement): Monitor whether users explore more pages with the new color scheme.

Survey Results (Happiness): Post-experiment survey asking for user feedback on app design and color preference.

Rationale: Splitting the full user base 50/50 lets us see how both existing and new users respond to the color change, giving a broad picture of user preferences and the impact on engagement.
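Once both groups have accumulated data, the difference in button interaction rates between variants can be checked for statistical significance before declaring a winner. A minimal two-proportion z-test sketch in Python (the user and click counts below are hypothetical placeholders, not real experiment results):

```python
from math import erf, sqrt

def two_proportion_z(clicks_a, users_a, clicks_b, users_b):
    """Two-sided z-test for a difference in button click-through rates."""
    p_a, p_b = clicks_a / users_a, clicks_b / users_b
    p_pool = (clicks_a + clicks_b) / (users_a + users_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / users_a + 1 / users_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: variant B (lighter scheme) vs. variant A (current).
z, p = two_proportion_z(clicks_a=480, users_a=5000, clicks_b=560, users_b=5000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

Firebase A/B Testing reports significance on its own, so a manual test like this is mainly useful for analyzing exported BigQuery data or custom metrics.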

Variations:

Variation A: Original color scheme (e.g., darker color palette). Design: current color layout with all default elements.

Variation B: New color scheme (e.g., lighter color palette with higher contrast). Design: redesigned elements with lighter background tones, improved contrast, and easier readability.

Mockups and diagrams for both color schemes will help illustrate each variation's interface and assist in collecting user feedback.

Mcpmatt commented 9 hours ago

A/B Test Name: In-Game Store vs. Profit Tracker for User Retention

User Story Number: US4 (The Golden Path)

Metrics:

What problem are we trying to solve? Its impact?

Problem: Users may lose interest if the app lacks engaging incentives for continued play. While tracking profit is informative, it may not provide enough motivation for users to keep playing and earn tokens.

Experiment:

1. Audience:

2. Tracking in Firebase Analytics:

Variations:

1. Variation A: In-Game Rewards Store

2. Variation B: In-Game Profit Tracker
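The retention this test targets can be measured as day-N retention, computed from per-user activity dates (e.g., from a Firebase Analytics export). A minimal sketch, assuming hypothetical activity data:

```python
from datetime import date

# Hypothetical export: user id -> set of dates on which the user was active.
activity = {
    "u1": {date(2024, 11, 1), date(2024, 11, 2), date(2024, 11, 8)},
    "u2": {date(2024, 11, 1)},
    "u3": {date(2024, 11, 1), date(2024, 11, 8)},
}

def day_n_retention(activity, cohort_day, n):
    """Fraction of the cohort (users active on cohort_day) also active n days later."""
    cohort = [u for u, days in activity.items() if cohort_day in days]
    target = date.fromordinal(cohort_day.toordinal() + n)
    retained = [u for u in cohort if target in activity[u]]
    return len(retained) / len(cohort) if cohort else 0.0

print(day_n_retention(activity, date(2024, 11, 1), 7))
```

Comparing day-7 retention between the rewards-store group and the profit-tracker group would directly answer which variation keeps users coming back.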