Dakota Keyser is an ecommerce leader in the digital retail and fashion industry. He started his career in digital marketing within paid search and SEM at iProspect, an advertising agency. Dakota then transitioned to digital analytics at JCPenney before spending nearly seven years at Madewell, a J.Crew brand. Most recently, he pivoted to work under the broader J.Crew umbrella and served as Director of Ecommerce Operations and Digital Strategy.
In our conversation, Dakota talks about how he helped implement a data-driven culture across teams at a company. He also discusses how he uses historical data to size up opportunities for upcoming product trends, as well as how lightweight personalization efforts positively impact the customer experience.
I studied communication in college. After graduating, I worked at a small SEO agency, and later transitioned to a paid search advertising agency. I was writing ad copy and running small A/B tests, but I found building dashboards in Excel to be the most exciting part of my job. Ultimately, this led me to work as a site analyst at JCPenney. This is where I started to dive into the technical components of digital analytics, such as working through re-implementations and writing business requirements and technical specifications. It was so exciting for me.
After that, I moved into another analytics-focused role at Madewell, which is part of the J.Crew Group. At the time, the company was re-platforming the website, which led me to own customer personalization. This involved A/B testing, search engine optimization, and creating sorting rules for category merchandising. I ended up doing all of these things for three different J.Crew group brands down the line.
Eventually, I fell in love with building things for the customer. I love taking in user feedback and creating relevant experiences with available technology. That’s how I arrived at the intersection of analytics, building, and customer feedback.
Fostering a culture of data-driven experimentation requires vertical and lateral evangelization. First, I make sure that my team is using data to inform their decisions more often than relying on intuition or creativity. I encourage them to be very mindful of how they’re synthesizing the data. Finally, I make sure that they’re validating insights using summative research.
In a more literal sense, I encourage everybody — regardless of their level and whether or not they’re on my team — to share ideas and back them up with data. That helps to make data-driven experimentation the ethos of the whole company, not just that of my specific team. We don’t test everything because some things don’t make it through the prioritization matrix filter, but everybody has a voice. In addition, organization-wide transparency helps us to see how ideas stack up against each other.
Many retail apparel companies are creatively driven, so I found it important to continue promoting data as a vehicle of success. My goal is always to ingrain data into the company culture by showing various stakeholders what to do with it. At J.Crew, it was especially important to also explain why we were doing certain things with data and circle back to share outcomes.
I use an impact-effort matrix, which is a four-quadrant diagram that visualizes the potential impact and effort of all the gathered ideas. This enables my team to prioritize the ideas that end up in the high-impact, low-effort quadrant. You need to intuitively understand the resources required for each idea — that’s why it’s crucial to be in a multi-disciplinary role if you’re looking to do this exercise by yourself. Once you’ve completed the matrix, you can validate your estimations with data and then get buy-in from your leadership team.
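To make the quadrant logic concrete, here is a minimal sketch in Python, assuming each idea gets a rough 1–5 impact and effort score from the team. The idea names, scores, and threshold are illustrative, not an actual backlog.

```python
# Minimal impact/effort quadrant sort (illustrative scores, not real backlog data).
from dataclasses import dataclass

@dataclass
class Idea:
    name: str
    impact: int  # 1 (low) to 5 (high), estimated with the team
    effort: int  # 1 (low) to 5 (high), estimated with the team

def quadrant(idea: Idea, threshold: int = 3) -> str:
    high_impact = idea.impact >= threshold
    low_effort = idea.effort < threshold
    if high_impact and low_effort:
        return "quick win"      # prioritize first
    if high_impact:
        return "big bet"        # plan and resource deliberately
    if low_effort:
        return "fill-in"        # do when there is slack
    return "deprioritize"

ideas = [
    Idea("auto-apply coupon", impact=5, effort=4),
    Idea("clarify size guidance", impact=4, effort=2),
    Idea("reorder category sort", impact=2, effort=1),
]

# Sort so that low-effort, high-impact ideas surface at the top of the list
for idea in sorted(ideas, key=lambda i: i.effort - i.impact):
    print(f"{idea.name}: {quadrant(idea)}")
```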
Yes, and it usually comes down to the resources, bandwidth, and overall effort required to solve that problem. I’ll continue advocating, but if the team can’t prioritize something, I’ll go back and estimate what the problem costs from a financial perspective. I then use that as an input to counterbalance some of the business objectives.
At times, larger strategic initiatives may trump what I’m advocating for, so it’s important to have faith in leaders and believe that they’re taking teams on the right path.
This is a fun question because I like to think of this approach as a long spectrum — people can either be very heavyweight or very lightweight with it. Some people love to go straight in with the heavyweight approach, which works if you’re looking for a process that’s robust and repeatable. To go down this route, you’ll need hefty technical systems that connect your CDPs and guarantee good data integrity. You’ll also need data engineers to architect these systems.
I tend to prefer a mid-weight approach, which involves putting the data I need into tools like Excel and joining them up with a data visualization tool like Power BI. This ad hoc approach is more adaptable and scalable. I’ve found that as we get a deeper understanding of the data we have, we can get insights faster.
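As a rough illustration of that mid-weight joining step, here is a sketch using pandas, assuming two flat exports (orders.csv and sessions.csv) that share a session_id column. The file names and columns are hypothetical.

```python
# Join two flat exports on a shared key and produce a summary table
# that can be dropped into Power BI or Excel (file/column names are hypothetical).
import pandas as pd

orders = pd.read_csv("orders.csv")      # session_id, order_total, coupon_used
sessions = pd.read_csv("sessions.csv")  # session_id, traffic_source, device

joined = orders.merge(sessions, on="session_id", how="left")

summary = (
    joined.groupby("traffic_source")
    .agg(orders=("session_id", "count"),
         revenue=("order_total", "sum"),
         coupon_rate=("coupon_used", "mean"))
    .reset_index()
)

summary.to_csv("channel_summary.csv", index=False)  # hand this file to the viz tool
```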
The easiest way I’ve accomplished this is with a post-purchase or site intercept survey. If you keep the surveys simple, use them consistently, get feedback at different touch points, and link feedback to the user’s session data, you can easily dig into the problems that customers are facing. After that, you can then use the prioritization matrix that I mentioned earlier to prioritize solving the high-impact, low-effort problems.
In a previous role, I worked for a company that heavily utilized coupon codes. The senior leader of ecommerce reached out to me wondering why coupon code usage was so low given that almost every product was eligible for one. I dug into the customer feedback and found that people couldn’t easily find the codes, and that often led to them dropping off and not finishing their purchases.
As a result, we ended up building an auto-apply coupon code functionality that always gave the customer the best possible price. This was a meaningful change that met the executive’s goal of increasing the number of customers completing their purchases, and also fulfilled the customer’s goal of making it easier to evaluate the savings of different coupon codes.
Even though customers using coupon codes meant smaller cart totals, in aggregate, they could be spending more with us than they would have originally. Plus, it avoided the possibility of the company having to sell these products at a deeper discount in the future. There’s definitely a lot to consider, but the real nugget is that we tested multiple variants to see what gave us the best result.
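A minimal sketch of the "always apply the best price" idea might look like the following, assuming a simple model of percentage and fixed-amount coupons. The discount rules and code names are hypothetical, not the production logic.

```python
# Pick the eligible coupon that yields the lowest cart total
# (hypothetical discount model; real rules would live in the promotions engine).
from dataclasses import dataclass

@dataclass
class Coupon:
    code: str
    kind: str          # "percent" or "fixed"
    value: float
    min_subtotal: float = 0.0

def discounted_total(subtotal: float, coupon: Coupon) -> float:
    if subtotal < coupon.min_subtotal:
        return subtotal  # not eligible, no discount applied
    if coupon.kind == "percent":
        return subtotal * (1 - coupon.value / 100)
    return max(subtotal - coupon.value, 0.0)

def best_coupon(subtotal: float, coupons: list[Coupon]) -> Coupon | None:
    eligible = [c for c in coupons if subtotal >= c.min_subtotal]
    if not eligible:
        return None
    # The "best" coupon is the one that leaves the customer with the lowest total
    return min(eligible, key=lambda c: discounted_total(subtotal, c))

coupons = [Coupon("SAVE20", "percent", 20), Coupon("TAKE15", "fixed", 15, min_subtotal=50)]
print(best_coupon(120.0, coupons).code)  # auto-apply this code at checkout
```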
A lot of brands do a final, no-return sale once the product has a steep enough discount. This is a very effective method for reducing returns, but it also creates customer friction. It can reflect poorly on the brand if you’re not allowing a customer to return a product.
The key to reducing returns is to be as transparent as possible using social proof and metadata. When I say metadata, I’m talking about product attributes like the weight and stretch of the fabric, or other relevant metrics like the number of people viewing the item. Sizing is a big one — things like body type, height, weight, and general fit. I’m a huge advocate of providing enough sizing information to inform the customer without directly telling them what size they should purchase.
We can use data to decide what information to provide. Many brands are using generative AI to consolidate customer feedback and put more relevant information on the page. Ultimately, you could say something along the lines of “This is a sheer product, so if it’s a little too see-through for your taste, here’s something else.”
In particular, make sure that you’re collecting and assessing customer feedback with your returns so you can get ahead of customer pain points. That way, you can even use the insights when launching new products.
I love looking back at historical data using Google Trends, which can help to inform timing. I’ve also used similar tools to size up opportunities for upcoming product trends. In addition, my team works with vendors who will give quarterly presentations and show us the things they think are going to be popular.
Nowadays, I tend to focus more on the here and now. For example, let’s say that we’re developing a new pair of shorts for the spring and we’re seeing a rise in popularity for shorter inseams. In that case, we can deepen the assortment in that fit. We still need to be mindful of whether the trend is relevant for our customer base and whether it’s financially sustainable, though. Before we invest in a trend, we want to determine whether it makes sense for our business.
Lastly, we also use our own intellect to weed out what doesn’t make sense to us. We don’t necessarily want to jump on a trend that might disappear in three months just because everybody else is doing it.
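As one way to quantify whether a trend is still growing, here is a small sketch that computes year-over-year growth from an exported interest time series, for example a Google Trends CSV. The file name, columns, and growth threshold are assumptions.

```python
# Rough year-over-year growth check on exported search-interest data
# (hypothetical "trend_interest.csv" with date and interest columns).
import pandas as pd

df = pd.read_csv("trend_interest.csv", parse_dates=["date"])
df["year"] = df["date"].dt.year

yearly = df.groupby("year")["interest"].mean()
yoy_growth = yearly.pct_change().iloc[-1]

print(f"Average interest by year:\n{yearly}")
print(f"Latest year-over-year growth: {yoy_growth:.0%}")

# Illustrative gate: only deepen the assortment if the trend is still accelerating
if yoy_growth > 0.15:
    print("Trend is growing; worth sizing up a deeper buy.")
```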
The prioritization matrix acts as the starting point. But I think the other piece of success with A/B testing comes from combining data, customer feedback, product expertise, and UX expertise to collaboratively build a product.
Specifically, when it comes to A/B testing, I try to find the right balance of testing small tweaks and large product changes. My team spends a lot of time refining the UX by going through a review process and having collaborative sessions between myself and someone from digital analytics, UX, frontend engineering, etc., so that we can plan and build great experiences.
When we’re brainstorming, I make sure to not tell my designers and developers what to build and how to build it. Instead, I explain what we’d like the outcome to be while handing over any tools, data, and insights that we have at our disposal. I find that this helps us architect more mindfully — as opposed to copying something that somebody else has built. That’s what really brings about success.
Naturally, we also make sure that we continuously measure the impact of these changes and make even more improvements, which is a lot of effort but always ends up being highly valuable.
Personalization can mean many things. When I approach personalization, I’m not looking at visitors one-to-one. Instead, I’m looking at historical data that we have about visitors similar to them, such as folks from the same location or who arrived on a given website in the same way.
The way that visitors engage with a website, its products, and its discounts will vary depending on whether they come from an affiliate coupon website, a blog, or a search engine. And if I have this context, I can put different things in front of them, especially if we’re in a big promotional period.
If someone came to us from organic search, then they already know what they’re looking for, in which case I would tailor their search experience. If they were referred via an affiliate URL, I know that they want to use a coupon, so I’ll display a coupon banner at the top of the page and hit them with that information in a few other places too.
By approaching personalization in this way, it’s easier to be more discreet. It doesn’t appear to be targeted, so that’s another benefit of what I would call lightweight personalization.
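A lightweight version of that referral-based tailoring could be sketched as a simple mapping from referrer to experience flags. The domains, segment names, and flags below are illustrative placeholders, not a production rule set.

```python
# Map the referral source to a lightweight experience tweak
# (placeholder domains and flag names; real lists would come from config).
from urllib.parse import urlparse

AFFILIATE_DOMAINS = {"coupons.example.com", "deals.example.net"}
SEARCH_DOMAINS = {"google.com", "bing.com", "duckduckgo.com"}

def experience_for(referrer_url: str) -> dict:
    host = urlparse(referrer_url).netloc.removeprefix("www.")
    if host in AFFILIATE_DOMAINS:
        # Coupon-seeking traffic: surface the coupon banner up front
        return {"segment": "coupon-seeker", "show_coupon_banner": True}
    if host in SEARCH_DOMAINS:
        # High-intent search traffic: emphasize search relevance instead
        return {"segment": "high-intent-search", "boost_search_relevance": True}
    return {"segment": "default", "show_coupon_banner": False}

print(experience_for("https://coupons.example.com/store/brand"))
# -> {'segment': 'coupon-seeker', 'show_coupon_banner': True}
```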
My go-to is following vendors on LinkedIn and reading the articles they post. Conferences like Shoptalk, Adobe MAX, Summit, and CommerceNext are also super helpful. On the inspiration side of things, I like seeing what other brands are doing on their websites.
I also stay sharp by being curious and finding the time to test new things, like ChatGPT or Google Data Studio. I always recommend that people make an effort to maintain their curiosity throughout their careers.