Julie Acosta is Director of Ecommerce and Marketing Analytics at NOBULL, a footwear, apparel, and accessories brand. She began her career as a marketing analyst and then transitioned to sales roles while at Aviall, a Boeing company. Prior to her current role at NOBULL, Julie spent 10 years at AutoZone, where she worked in category management, ecommerce, and digital analytics.
In our conversation, Julie reflects on her efforts to unlock an organization’s curiosity by promoting a data-driven culture. She shares examples of times she’s used data in context to enhance engagement and reduce friction. Julie also discusses how she bridges the gap between the online and in-person retail experience.
I stumbled into the world of analytics through different jobs that put me in a numbers-oriented role, but I have a background in marketing and advertising. One of the most fascinating things about customer journey analytics is understanding customer intent, but it is like solving a puzzle. Someone who works in this area has to be massively curious and interested in understanding not only their own actions as a consumer but also what others do as consumers.
There’s an empathetic component, too — trying to understand what makes sense. But not everybody has that mindset. How do you sell the importance of understanding your customer and their intent, and the value that provides to the business, if people within the business don’t inherently understand it? So, one of the traits I always look for when hiring for my team is curiosity. That’s something that’s hard to teach — it’s innate for a lot of people, whereas other qualities you can develop over time.
The biggest challenge for me was how to consistently drive that passion across all business teams. It took a lot of negotiating with leadership to help drive that message top-down, and that only works if you have data-driven leadership.
As marketers, we’ve become a lot smarter, and in some cases have had to reevaluate how we track things due to increased privacy concerns and compliance. The first step was to make sure that leadership was aligned and understood the main KPIs we would be tracking. Leadership had to be the ones who set the tone for the teams and say, “These are the things we’re going to track. Have these in your back pocket at all times. You need to have your baseline understanding.”
From there, we had high-level KPIs. Now, if you’re in digital marketing, you have to understand what your channel is specifically doing for the business. And then you can understand how that point of view fits into that higher KPI or that bigger, more aggregated view. And so when we, as a team, started creating that culture of being more data-driven, it made it a whole lot easier.
I run enablement sessions, which we call analytics days. I got this idea from a candidate I interviewed; he used to do this at Microsoft. We started these sessions at my previous job, and they were such a joy. We held sessions three years in a row, with about 150–200 people engaged for the whole day.
It was amazing to start seeing everyone’s curiosity unlock. One person would pull another in and say, “Did you see this session?” That level of engagement wouldn’t have existed had we not had a data-driven culture. My team would leave those sessions absolutely dumbfounded by how much engagement they got and how many questions they had. It brought so much joy to my heart.
We would start planning three to four months in advance. I always try to make sure we are in a good position from a data integrity and a data availability standpoint. I never schedule a session just for the sake of having a session. Each year, we looked to see what the business was asking.
We curated each analytics day session to cover the topics the audience wanted to hear. This helped drive the messaging home because we would unpack a general theme or topic that we wanted to cover and really dig into the major insights and what the audience needed to know to do their job better.
You have to understand customer intent. And you will only have this understanding if you know the overall customer journey and can pick it apart. Who’s doing this and why? Are they in the conversion phase, the discovery phase, or are they just there to manage their account?
I previously worked for an auto parts retailer. We had an ecommerce site as well as physical stores. Our leadership team had more traditional ecommerce backgrounds, so they were used to seeing metrics from a digital perspective. We noticed that bounce rates on our website were high, so we decided to investigate.
The first step was understanding how users were entering the site. We saw that a third of the sessions associated with high bounce rates were landing on our locations page — customers were visiting the website to locate the closest store. This resulted in low time on site and a high bounce rate, but users got the information they needed and then went to a store. That’s an example of a journey that made sense once we uncovered the reasons behind the metrics.
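To make that kind of investigation concrete, here's a minimal sketch of segmenting bounce rate by landing page. The column names and rows are hypothetical stand-ins for a session-level analytics export:

```python
import pandas as pd

# Hypothetical session-level export: one row per session, with the
# entry (landing) page and a bounce flag (single-page session).
sessions = pd.DataFrame({
    "session_id": [1, 2, 3, 4, 5, 6],
    "landing_page": ["/locations", "/locations", "/product/123",
                     "/", "/locations", "/product/456"],
    "bounced": [True, True, False, False, True, False],
})

# Bounce rate and traffic share per landing page, sorted so the
# pages driving the most bounces surface first.
by_page = (
    sessions.groupby("landing_page")
    .agg(sessions=("session_id", "count"), bounce_rate=("bounced", "mean"))
    .sort_values(["bounce_rate", "sessions"], ascending=False)
)
by_page["traffic_share"] = by_page["sessions"] / len(sessions)
print(by_page)
```

A breakdown like this is what surfaces the pattern described above: a large share of high-bounce sessions entering on the store locator page.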
This goes back to understanding, one, customer intent, and two, how your site is architected.
In terms of customer intent, you have to think hard about what the customer is trying to do. For example, in a previous job, the website was designed with a product grid page. There were multiple products on the page, and the user could compare them, decide what they wanted, and then add them directly to the cart from there. We didn’t require users to take the additional step of visiting the product detail page because it often wasn’t necessary in this context.
As you think about the site architecture, consider whether you can use internal search terms to filter the subset of products more accurately without having the user manually enter the filter. You’ll need to evaluate the queries and also consider how smart your search platform can be.
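As a rough illustration of that idea, the sketch below maps recognized tokens in an internal search query to structured grid filters. The token lists and function are invented for illustration; a real implementation would lean on the search platform's own query-understanding features:

```python
# Hypothetical vocabularies; in practice these would come from the
# product catalog or the search platform, not hand-rolled sets.
KNOWN_COLORS = {"red", "black", "white"}
KNOWN_CATEGORIES = {"shoes", "shorts", "tees"}

def query_to_filters(query: str) -> dict:
    """Derive structured filters from a raw internal search query so
    the product grid arrives pre-filtered for the user."""
    tokens = query.lower().split()
    filters = {}
    colors = [t for t in tokens if t in KNOWN_COLORS]
    categories = [t for t in tokens if t in KNOWN_CATEGORIES]
    if colors:
        filters["color"] = colors
    if categories:
        filters["category"] = categories
    return filters

print(query_to_filters("red training shoes"))
# {'color': ['red'], 'category': ['shoes']}
```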
The balance lies in reading between the lines and figuring out what’s going on. If you’re trying to shorten the time it takes a customer to add an item to the cart, you have to make that option available to them as soon as possible. If you are unsure, you may want to do an A/B test to see what works best. Your findings will help you decide how to optimize a page based on customer behavior. But you’re not making that call — you’re letting the customer tell you the best way to find that balance.
This goes back to being able to have a data-driven team that will make decisions only after understanding their impact. You should always have a member of the analytics team be your gut check. Sometimes, they’ll come up with the most conservative answer, and the business may not like to hear it, but that’s fine. That’s where it becomes a little more democratic — listening to different points of view and considering the overall group’s opinion.
There have been times when we did not reach statistical significance. There are testing tools for physical retailers like APT that estimate when you will reach statistical significance based on the data the tool has ingested. This will give an approximation of a general timeline. On the digital side, many A/B testing and optimization tools do not offer that. There can also be a lot of seasonality depending on the industry, so it’s hard to lock down an estimation of when you’ll reach statistical significance because so many other components tie into that.
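When a tool doesn't estimate that timeline for you, a back-of-the-envelope figure can come from the standard two-proportion sample-size formula. The baseline rate, expected lift, and traffic numbers below are placeholders, and as noted, seasonality can still throw the estimate off:

```python
from scipy.stats import norm

def days_to_significance(baseline_cr, relative_lift,
                         daily_sessions_per_variant,
                         alpha=0.05, power=0.8):
    """Rough days-to-significance for a two-variant conversion test,
    via the standard two-proportion sample-size formula."""
    p1 = baseline_cr
    p2 = baseline_cr * (1 + relative_lift)
    z_alpha = norm.ppf(1 - alpha / 2)   # two-sided significance
    z_beta = norm.ppf(power)            # desired statistical power
    n_per_variant = ((z_alpha + z_beta) ** 2
                     * (p1 * (1 - p1) + p2 * (1 - p2))
                     / (p2 - p1) ** 2)
    return n_per_variant / daily_sessions_per_variant

# Placeholders: 3% baseline conversion, detecting a 10% relative
# lift, 5,000 sessions per variant per day -> roughly 11 days.
print(f"{days_to_significance(0.03, 0.10, 5000):.0f} days")
```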
So, we don’t always have the luxury of understanding how much time we will have. Sometimes it’s a negotiation and there are points of contention. At times, we’ve had to say, “We absolutely should not go forward with this one because it is too risky.” At other times, we’ve had to make a call and say, “Yes, this is risky but we’re going to go forward anyway because the potential payoff is big.” Again, this goes back to having a group that will speak to site experience and decide to move forward. That’s why you must have some experience, background, and the right people in the room.
There are four main tools that I use and recommend to get a better feel for how you’re going to optimize that experience. The first is session replay tools. These are great for trying to understand how customers are engaging. Some capture end-to-end journeys, so you can see, for example, the pain points from the moment a user lands on the site. Some of the site experience may be due to something other than design; it could be due to a backend issue.
For example, I recently saw an issue where customers couldn’t add items to their carts, but it had nothing to do with us — the commerce platform had a bug. Session replay tools are beneficial for highlighting some of that.
The second tool is customer feedback. How are we collecting customer insights from what they proactively provide to us? It could be product reviews, ratings, surveys, etc. How can we use that in combination with the data we already have to gather insights? Hopefully, it’s complementary. The third is post-transaction surveys. How was the whole experience? At NOBULL, I think we do a really good job at this — we reach out at the right moment after the purchase and ask, “How did you find us?” We follow up with a couple of questions about how their experience was on the site and whether they encountered any hiccups.
Lastly, we can look into site usage data to see if we can provide some guidance to users. If a customer has been staying on a page for X amount of time, maybe it’s because there’s an issue. Maybe we then provide a pop-up chat to ask if everything is going smoothly. This pop-up chat is an example of a site intercept survey tool, which is very helpful for collecting timely and targeted feedback from users while they’re actively engaged with the website.
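On the data side, that trigger can start as a simple dwell-time rule. This is a minimal sketch with invented column names and an arbitrary threshold; a production rule would more likely compare against each page's historical norm:

```python
import pandas as pd

# Hypothetical page-view data: how long each visitor spent on a page.
page_views = pd.DataFrame({
    "visitor_id": ["a", "b", "c", "d"],
    "page": ["/checkout", "/checkout", "/product/123", "/checkout"],
    "seconds_on_page": [45, 310, 20, 290],
})

# Illustrative rule: a visitor lingering well past the threshold is a
# candidate for an intercept survey or a proactive chat prompt.
DWELL_THRESHOLD_SECONDS = 240

candidates = page_views[page_views["seconds_on_page"] > DWELL_THRESHOLD_SECONDS]
print(candidates[["visitor_id", "page"]])
```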
For all of these factors, you have to find a way to collect and organize that data in a digestible way. You’re not just looking at these disparate sources — there has to be a way to aggregate the data and translate it into actionable insights.
It can be tricky. For multichannel retailers, sometimes the intent is to understand the dynamic. If you’re in leadership, you want to ensure that there’s no cannibalization between your channels and that they’re working seamlessly together. To have an omnichannel mindset, you must remember that both online and in-store have a purpose. Especially in our case for automotive parts, people often need to go in and talk to a specialist rather than try to figure it out online.
Another example is helping people select the right product tier. Online, you can provide that information using a comparison chart. You can have the good, better, and best options and the features and benefits for each product. If the customer is willing to use the comparison charts, then you’ve replicated an in-person experience and helped the customer make a decision.
Our in-store sales force can provide customers with extra information that the site sometimes doesn’t have. They might share personal experiences or what they know from other customers they’ve talked to. You can’t replicate that on a website. So, you’ll often see different types of behaviors from customers when looking at digital and offline engagements.
Sure — this case was a multi-group strategy. One aspect was removing the interim search page. From an analytics standpoint, it was great — having an interim search page with the categories allowed us to see how search terms were associated with the click-through of certain categories.
We could say, “This category was clicked through X amount of times when this particular search term was used.” It allowed us to go into the internal search data and develop some advanced models that gave us information on buying propensity. But it wasn’t the best thing to do from a customer experience standpoint.
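That kind of aggregation ("this category was clicked through X amount of times when this particular search term was used") can be sketched as a simple group-by over the click log. The column names and rows here are hypothetical:

```python
import pandas as pd

# Hypothetical click log from the interim search page: which category
# tile was clicked after each internal search term.
clicks = pd.DataFrame({
    "search_term": ["brake pads", "brake pads", "oil", "brake pads", "oil"],
    "clicked_category": ["Brakes", "Brakes", "Motor Oil", "Rotors", "Filters"],
})

# Click counts and share per (search term, category) pair: the raw
# inputs behind a buying-propensity model.
ctr_table = (
    clicks.groupby(["search_term", "clicked_category"])
    .size()
    .rename("clicks")
    .reset_index()
)
ctr_table["share"] = (
    ctr_table["clicks"]
    / ctr_table.groupby("search_term")["clicks"].transform("sum")
)
print(ctr_table)
```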
Initially, it didn’t amount to a whole lot. We weren’t seeing big shifts in conversion rate. In fact, we saw the opposite — it declined slightly over time. But it is important to recognize that there can be a learning curve for customers. You may roll out an enhanced experience, but it may feel unfamiliar to customers who are used to the prior version of your website. That’s a slippery slope.
For me and many others, DEI picked up a lot during COVID. There was this feeling of making sure that voices were heard. There was the participation part, but did introverted people have an opportunity to be heard? Could they get their opinions out there, especially when we were remote and working in silos?
I was asked to lead inclusion efforts in my specific unit through a Hispanic leadership organization. We also focused on some of our customers and made sure that their voices were out there. The customer’s voice translated into our marketing, our offerings, and everything we did. That was one piece of DEI.
For me, DEI is about understanding who the customer is. Everybody has to have a voice at some point, and they’re using that voice, whether in how they navigate or based on the feedback they voluntarily give you. It’s up to us to interpret that and get it.
If I only consider my own experiences and the things I’m shopping for, I’m mirroring only a portion of my potential audience. I’m not going to be designing for them, but for myself, and in that space, the design won’t necessarily make sense for everybody.
You need to research industries outside of yours to gain that perspective. DEI today means trying to understand all voices — even those that might seem unrelated to what you do today.