00:00:00
AMY DESCHENES: ...That introduction.
00:00:03
So as Sara said, we're here today to talk
to you about our project.
00:00:07
So our talk is where Google Analytics meets
UX, how a UX team implemented GA4.
00:00:12
And we'll start by introducing ourselves.
00:00:14
So I'm Amy.
00:00:15
I use she/her pronouns, and have white skin,
shoulder length brown hair, and wear glasses,
00:00:21
and I'm the head of user experience and digital
accessibility at Harvard Library.
00:00:24
And I have worked here for about eight years.
00:00:27
MEG MCMAHON: I am Meg McMahon.
00:00:29
I use they/them pronouns.
00:00:31
I have dark brown, chin-length hair,
blue eyes, black-framed glasses, and white
00:00:36
skin.
00:00:37
I'm the UX researcher at Harvard Libraries,
and have worked here for almost exactly a
00:00:41
year now.
00:00:42
AMY DESCHENES: So we're going to share about
our library's project to move to Google Analytics
00:00:48
4, or GA4.
00:00:50
In June of 2022, Google announced that their
old analytics product, which you probably
00:00:55
just called Google Analytics but whose formal
name was Universal Analytics, would be retiring
00:01:01
in June of 2023.
00:01:03
So we took this as an opportunity to really
implement a robust analytics strategy while
00:01:09
migrating all of the library's web products
and websites over to GA4.
00:01:14
And it took us about eight months to do all
of the work we're going to talk about today.
00:01:18
So here's what we're going to go through.
00:01:20
We're going to talk about the migration strategy,
and the goals for the project, and who worked
00:01:26
on the project, and then how we implemented
Google Analytics 4: everything from installing
00:01:32
the code, creating reports, and determining how
we were going to set things up, to sharing information
00:01:38
with stakeholders and consumers of the data.
00:01:41
Then the lessons learned about GA4, and
more information about how we are continuing
00:01:46
to share out with stakeholders and staff.
00:01:51
So a little bit about strategy before I jump
into the project itself. We as the UX
00:01:57
team use web analytics regularly as part
of research studies.
00:02:01
So depending on the research question, or
the goal of the user research, analytics are
00:02:07
sometimes the best way to get an answer about
user behavior, especially if it's an answer
00:02:13
that is best served by quantitative data.
00:02:16
For instance, say someone on the
Discovery to Delivery committee asks, what
00:02:22
are the most popular filters in HOLLIS,
00:02:24
which is our library's catalog?
00:02:26
Rather than doing a usability study with only
5 to 10 folks, or a survey,
00:02:32
looking at the analytics is the
best and most accurate way to answer
00:02:37
that question, because you're looking at real
data.
00:02:40
So these questions come up all the time, especially
during the beginning phases of our UX projects.
00:02:47
So during the discovery phase.
00:02:49
So just having the analytics service sit within
our group made sense for our organization.
00:02:56
And actually, talking about our organization,
I do want to just share a little bit of context
00:03:01
for our team.
00:03:03
So yes, I am going to show you an org chart.
00:03:06
But I think it's really important just to
understand our approach, and how we really work
00:03:10
across the organization.
00:03:13
So in the org chart, you're going to see--
this is just really like the high level organization
00:03:17
of Harvard Library.
00:03:20
On the left, discovery and access, the yellow
box, is where our team sits, and then
00:03:25
the other units are archives and special collections,
scholarly resources and services, the anti-racism
00:03:31
team, library technology, administrative operations,
and strategy, communications, and assessment.
00:03:37
And I have filled in library technology and
strategy communications and assessment with
00:03:42
the Crimson color.
00:03:43
Because these are the two groups who were
the main stakeholders and main participants
00:03:47
in this project work.
00:03:48
But we have shared analytics data since then
with all of the other units.
00:03:53
So just to go a little bit deeper, if you are
curious, about our organization a little
00:03:57
bit more.
00:03:59
Our team, the UX and discovery team, is six people.
00:04:03
And then everyone else in our unit sits in
access services, technical services, imaging
00:04:07
services, and special projects, which is over
200 staff members.
00:04:12
So now that you have a little context about
our organization, I want to give you a little
00:04:16
context about the websites and the different
web products we support.
00:04:20
So you can see here, we have a list of nine
different discovery systems.
00:04:25
And some of them are grouped together.
00:04:26
And I'm just going to walk you through each
of them.
00:04:29
And the numbers you're seeing here,
which I will read aloud, are the numbers of
00:04:35
sessions for each of our web products from
March of 2023.
00:04:38
So this is very recent data.
00:04:40
It's probably missing today.
00:04:42
And tomorrow, obviously.
00:04:43
But this gives you a good idea of a snapshot
of a month in library web traffic at Harvard.
00:04:50
I will also note that the numbers you are
looking at are in sessions.
00:04:55
And it's important to distinguish sessions
from views, right?
00:04:59
So sessions means that, say, someone opened
up HOLLIS, and they did a search, and then
00:05:05
they used a filter, and then they logged in,
and then they did another search.
00:05:08
That's all counted as one session.
00:05:10
With page views, each view
of the different pages they visited would
00:05:15
be a separate count, right?
00:05:17
So page view numbers are usually a lot bigger.
00:05:20
And depending on what kind of information
you're looking for, or what your question
00:05:24
is, it might make sense sometimes to use sessions,
and sometimes to use page views.
00:05:29
If you want to know how many people are looking
at the library website's home page, use page
00:05:33
views.
00:05:34
But if you're thinking about overall traffic,
or the journey someone's taking from, say, HOLLIS
00:05:39
to a finding aid in archival collections,
00:05:43
that might be better served by looking at
a session metric.
00:05:46
So I just wanted to go over that difference
between sessions and page views.
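As a rough sketch of that difference, here is what pulling both metrics for each site might look like with the GA4 Data API's Python client. The property ID and date range are placeholders, not our real configuration; "sessions" and "screenPageViews" are the standard GA4 metric names.

    from google.analytics.data_v1beta import BetaAnalyticsDataClient
    from google.analytics.data_v1beta.types import (
        DateRange, Dimension, Metric, RunReportRequest,
    )

    # Placeholder property ID; credentials come from the usual
    # GOOGLE_APPLICATION_CREDENTIALS environment variable.
    client = BetaAnalyticsDataClient()
    request = RunReportRequest(
        property="properties/123456789",
        dimensions=[Dimension(name="hostName")],
        # "sessions" counts whole visits; "screenPageViews" counts every page load.
        metrics=[Metric(name="sessions"), Metric(name="screenPageViews")],
        date_ranges=[DateRange(start_date="2023-03-01", end_date="2023-03-31")],
    )
    for row in client.run_report(request).rows:
        print(row.dimension_values[0].value,
              row.metric_values[0].value,   # sessions
              row.metric_values[1].value)   # page views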
00:05:50
So up top, with the majority of our
traffic, is HOLLIS, which is our catalog and
00:05:56
article search with 277,000 sessions.
00:05:59
The main website, Library.Harvard.edu, is 163,000
sessions.
00:06:05
Library guides.
00:06:06
And these are library guides from all of the
schools and FAS and the College, 111,000 sessions.
00:06:14
Digital collection sites, this includes our
CURIOSity collections, which are curated sets
00:06:18
of items, as well as Harvard Digital Collections,
which is our digital item search,
00:06:23
39,000 sessions. Finding aids, 26,000 sessions;
events and appointments on our calendaring
00:06:29
system, 10,000 sessions; ask a librarian, 9,000;
00:06:33
our image catalog, Images, with 6,000; and our
geospatial data search with 1,000.
00:06:38
So that comes out to just about 642,000 sessions
for the month of March.
00:06:44
So previously, before having Google Analytics
4 and setting it up the way we did, it used
00:06:50
to take me probably a couple of days to pull
all this data together.
00:06:55
And now, with our updated strategy, we can
really just get it in one view.
00:07:01
So it's a lot easier.
00:07:02
So on the next slide, you will see what our
analytics setup looked like pre-2023.
00:07:09
So prior to this year, each digital product,
so those things that I just listed out, HOLLIS,
00:07:16
the image search, the website, each digital
product had its own instance of Google Analytics,
00:07:21
which was the old product Universal Analytics.
00:07:24
And this made it really difficult for us to
report comprehensive web traffic across all
00:07:29
the library websites, right?
00:07:31
So this announcement about GA4 really gave
us an opportunity to establish a more intentional
00:07:37
strategy for how we are setting up web analytics
and formalizing our approach.
00:07:42
So when we were thinking about this, we really
thought about, what are the future data needs
00:07:48
of the organization?
00:07:49
What are the questions we get asked about
most frequently?
00:07:53
So rather than having individual instances
of Google Analytics, we now have one singular
00:07:58
instance of Google Analytics 4.
00:08:00
And that same code is installed on all of
the different products.
00:08:03
And with some of the new functionality offered
in Google Analytics 4, which Meg will go into,
we're able to easily filter just to show HOLLIS
we're able to easily filter just to show HOLLIS
data in one report, or just to show website
00:08:14
data in another report.
00:08:16
So it is a lot more intuitive, and easier
to generate the reports we need most often
00:08:23
with this approach to setting things up.
00:08:27
So another thing we did talk about is the
other analytics products that are out there,
00:08:33
right?
00:08:34
So some other libraries, especially, are
using an open source product called Matomo.
00:08:40
Adobe also offers a paid solution
for web analytics.
00:08:43
So there are other things out there.
00:08:45
But for us, we decided to stick with Google
Analytics because one, it's what other units
00:08:53
at Harvard outside the library use.
00:08:55
It has really good documentation, and an excellent
user community.
00:08:58
It didn't require any additional technical
setup, other than installing the code initially
00:09:04
on the websites that we are going to be measuring.
00:09:09
And I think the other thing is it's consistent
with what staff are used to working with.
00:09:15
So for us, the benefits with Google Analytics
do outweigh the drawbacks.
00:09:20
But I will say that we are aware of concerns
that people have with Google Analytics.
00:09:25
And if people want to opt out of being tracked
on our sites, they are certainly able to do
00:09:30
so.
00:09:31
We actually have, on the privacy page on our
library website, as part of our privacy
00:09:38
statement, a section that lets folks know
that we use Google Analytics, and how they
00:09:44
can opt out if they want to.
00:09:46
So it says the Harvard Library uses Google
Analytics to gather statistics for portions
00:09:50
of library websites.
00:09:52
The information gathered will be used to improve
web services for patrons.
00:09:55
Google Analytics uses a browser cookie for
statistical analysis related to your browsing
00:10:00
behavior on these websites.
00:10:01
If you choose, you can opt out by turning
off cookies in the preferences settings in
00:10:06
your browser, or you can download and install
a Google Analytics opt-out browser add-on.
00:10:11
So that is right there in our privacy statement
on the library's website.
00:10:15
All right.
00:10:17
So that's all the context for now.
00:10:20
Here is the project team, who worked on this.
00:10:23
So in addition to Meg and myself, we had Vanessa
Venti, who was the digital collection services
00:10:29
manager, and Claire O'Keeffe, our editor and
content strategist from Harvard Library Communications.
00:10:34
Both of them have a lot of previous experience
working with Google Analytics, and had regular
00:10:40
reports that they produced for their stakeholders.
00:10:43
So we wanted to make sure that they were going
to continue to get the data they needed, and
00:10:48
also informed thinking about what questions
are you all getting from your stakeholders,
00:10:54
either with digital collections, sites, or
for the library website.
00:10:59
And then we also had Maura Ferrarini, who
is the UX developer, and sits in library technology
00:11:04
services.
00:11:05
She was integral just to getting the code
installed on all of the web products, making
00:11:11
sure it was working properly, and setting
up the initial instance in Google Analytics.
00:11:17
So we started with monthly meetings to plan
and talk about, one, what we wanted to learn,
00:11:22
right?
00:11:23
What are the differences between GA4 and Universal
Analytics?
00:11:26
And then we also worked together to develop
our project plan and goals, and moved forward
00:11:31
with the implementation and report creation.
00:11:35
So the main goals for the project are pretty
straightforward.
00:11:38
In addition to installing the new Google Analytics
4 code on all of the library's web products,
00:11:43
we wanted to, again, align the strategy for
all the products and make sure current staff
00:11:48
power users of analytics and the stakeholders
are getting the data that they need to answer
00:11:53
the questions that they have.
00:11:56
So we installed Google Analytics on all library
websites, created a unified dashboard for
00:12:01
website analytics, determined other reports
needed by staff, and established a web analytics
00:12:06
service as part of the UX team's responsibilities.
00:12:11
So again, we're using Google Analytics.
00:12:13
We needed to make this move before June
of 2023.
00:12:16
And we were successful, which is great.
00:12:18
But we're still in the process of just helping
stakeholders understand the analytics data,
00:12:24
and how it is going to be a little bit different
from what they had previously seen.
00:12:29
So all the data is still there.
00:12:30
But it might look a little different.
00:12:31
It might be named something different.
00:12:33
So we also wanted to make sure that the stakeholders
had the reports that they needed.
00:12:38
Though as Meg will explain, decoding some
of the technical differences between GA4 and
00:12:43
Universal Analytics, as well as finding the
best reporting solution, was definitely the
00:12:47
main challenge of the project.
00:12:49
So I will pass it to Meg.
00:12:51
MEG MCMAHON: Thank you, Amy.
00:12:54
So I'm going to be talking about the implementation
as Amy described, and explain a few case studies
00:12:59
to illustrate how we chose to use the Google
products.
00:13:03
So first things first, when we talk about
Google Analytics, for us, we actually mean
00:13:09
for different systems.
00:13:11
We use analytics itself.
00:13:12
We use Tag Manager, Looker, and Search Console.
00:13:17
So a little bit about each one.
00:13:19
Analytics is the base.
00:13:20
It's the home of our Google Analytics.
00:13:22
All the information that's tracked is housed
there, and is able to be accessed by anyone
00:13:30
who has access to our analytics property.
00:13:34
Tag Manager is where we're able to create
events that are tracked and displayed in analytics.
00:13:39
And these can be really specific, or really
broad.
00:13:42
It really depends on the needs of our stakeholders
in what type of events we create.
00:13:48
And then Looker is a Google-branded data visualization
tool that integrates with Google Analytics
00:13:53
along with other online analytics tools.
00:13:56
And then we have Search Console, which is
a tool to help us understand our SEO, and
00:14:02
the search terms that are bringing in users
to the website, and specific pages of the
00:14:07
website.
00:14:08
Search Console is not set up across every single
website because sometimes it doesn't make
00:14:13
sense to necessarily have it attached to a
library web property.
00:14:18
But for the most part, on the ones where
it makes sense, we have it implemented and
00:14:23
ready to use.
00:14:26
So we talked a bit about cross domain tracking
at the beginning.
00:14:29
But before we just dive deeper, I wanted to
touch a little bit more on it.
00:14:34
In Google Analytics 4, users are able to easily
track all websites related to their library.
00:14:41
This means that all the data for each website
is going into a single GA4 view, and it makes
00:14:45
it easier for us to track overall website
usage.
00:14:51
And a big note here is that if we had
all those disparate systems like we did before,
00:14:59
we would have duplicative users.
00:15:03
We wouldn't be able actually to see our full
user base as they move through the sites.
00:15:08
We would have a bunch of different numbers,
and we wouldn't be able to tell who was actually
00:15:12
the same user across all those different properties.
00:15:15
But now since it's one, we know that one user
can move from HOLLIS to LibGuides to a finding
00:15:20
aid if that's their user journey.
00:15:24
With cross domain tracking, GA4 uses browser
cookies to track a single user's journey between
00:15:29
sites.
00:15:30
So once, again, we're able to track those
sessions easier across our sites.
00:15:36
And for those in this conversation who don't
actually have GA4 implemented yet, you can
00:15:42
find more about how to do cross domain tracking
within the data streams admin panel.
00:15:50
Currently, we have this tracking across 10 different
websites under one account.
00:15:57
You saw the list that Amy had for sessions.
00:15:59
Those are all the properties we have tracking
currently.
00:16:04
So the biggest difference between Universal
Analytics and Google Analytics 4 is that Universal
00:16:11
Analytics was based on page views and sessions.
00:16:15
And GA4 is based on events and parameters.
00:16:19
So basically, what that means is it affects
the type of analytics that GA4 is showing
00:16:27
in the overview.
00:16:28
If you've ever seen Universal Analytics, it
would show pageviews first.
00:16:35
And that would be a priority.
00:16:36
Now, in Google Analytics 4, a page view is
an event.
00:16:43
They took page views, and made it an event.
00:16:46
They took sessions, and made it an event.
00:16:49
So everything is technically considered an
event now.
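As a minimal illustration of that event model, here is what a hit looks like through the GA4 Measurement Protocol, where a page view is just one more named event with parameters. The measurement ID, API secret, client ID, and the custom event name are all placeholders, not our real setup.

    import json
    import requests

    # Placeholders; real values live in the GA4 data stream settings.
    MEASUREMENT_ID = "G-XXXXXXXXXX"
    API_SECRET = "your-api-secret"

    payload = {
        "client_id": "555.1234567890",
        "events": [
            # A page view is simply an event named "page_view"...
            {"name": "page_view",
             "params": {"page_location": "https://www.example.edu/"}},
            # ...and a custom event has exactly the same shape.
            {"name": "search_filter_used",
             "params": {"filter_name": "library"}},
        ],
    }
    requests.post(
        "https://www.google-analytics.com/mp/collect",
        params={"measurement_id": MEASUREMENT_ID, "api_secret": API_SECRET},
        data=json.dumps(payload),
    )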
00:16:54
One of the biggest things to note about this
is like bounce rate.
00:16:58
A lot of people used bounce rate in the old
Universal Analytics.
00:17:03
And GA4 doesn't show bounce rate as front and
center as it did before.
00:17:08
And they decided to instead change that event
of bounce rate to something called engaged
00:17:13
sessions, and once we dug into the documentation,
we realized that the percentage of engaged
00:17:19
sessions is just the inverse of bounce rate.
00:17:22
So they're trying to prioritize use over non-use
when it comes to GA4.
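In practice, that means the old number is still recoverable with simple arithmetic; a quick sketch with made-up session counts:

    # Made-up numbers, just to show the relationship.
    sessions = 10_000
    # Engaged sessions last 10+ seconds, trigger a conversion,
    # or include at least two page views.
    engaged_sessions = 6_500

    engagement_rate = engaged_sessions / sessions   # 0.65
    bounce_rate = 1 - engagement_rate               # 0.35

    print(f"Engagement rate: {engagement_rate:.0%}")  # 65%
    print(f"Bounce rate:     {bounce_rate:.0%}")      # 35%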
00:17:31
So reading through the Google documentation
comparing UA to GA4 often helped us learn how
00:17:37
to find analytics that are now deprioritized
in GA4, change our thinking on what types
00:17:43
of analytics to prioritize, or even find new events
that we should care about when it comes to
00:17:50
what we're looking at and interested in within GA4.
00:17:55
What's really cool about GA4 is that you can
even create events within analytics itself.
00:18:01
You don't have to actually use Google Tag
Manager to create events if you're only going
00:18:06
to be tracking a couple of events.
00:18:08
For us, it made more sense to use Tag Manager
just because we had so many events we wanted
00:18:14
to track.
00:18:15
But for those who don't have GA4 yet, you
can add up to 50 tracked events just within
00:18:21
analytics itself.
00:18:23
And I'm going to talk a bit about some new
features for GA4 that we are utilizing for
00:18:28
different audiences and purposes right now.
00:18:32
So one of our case studies is the exploration.
00:18:36
So on this slide, there is an example of exploration,
which is located under the explore tab in
00:18:42
GA4.
00:18:43
An exploration is a new feature in GA4, where
you're able to compare sessions, or users
00:18:48
against each other, or you can use it as a
way to easily dig down, and answer questions
00:18:53
that you have to filter the data to answer.
00:18:55
There are a few kinds of explorations.
00:18:58
And this one is known as a segment overlap.
00:19:01
There are others you can find in the explore
tab as well.
00:19:05
So the way that the segment overlap works
is it actually has you create audience segments.
00:19:14
So in this case on the slide, you can see
we have Harvard Library sessions, HOLLIS sessions,
00:19:21
and LibGuides sessions.
00:19:23
So what this is doing is showing us the overlap:
which sessions used two properties or all
00:19:31
three properties in the same session, for
folks using our library.
00:19:39
This was a big question we had from upper
library management and product owners:
00:19:44
what actually is the audience overlap between
our systems?
00:19:48
And using an exploration, we are able to find
that answer.
00:19:51
A point of clarification on this exploration,
it's filtered to our OWEN audience as well.
00:19:57
So that's 18 to 24-year-olds in the Boston
and Cambridge area, or the closest identifier
00:20:03
to our undergraduate population that we can
get given how Google tracks users.
00:20:12
What's also really exciting, though you can't
see it in this exploration,
00:20:16
is that we're able to actually dig deeper
within an exploration like this
00:20:21
and see what pages are driving that
change.
00:20:25
So for example, we would be able to know that
on Library.Harvard, someone clicks from the
00:20:30
HOLLIS tool page to go to HOLLIS.
00:20:34
We're actually able to break down what are
those driving pages between the sites that
00:20:39
move people from site-to-site.
00:20:46
Another new feature of GA4 is the ability
to create more customized collections that
00:20:51
you can easily access from the reports tab.
00:20:53
There are two kinds of reports that you can use:
00:20:56
an overview report and a detailed report.
00:20:59
Overall for us, we found that detailed reports
are more filterable and easily customizable
00:21:04
than an overview report.
00:21:06
Reports can be on anything you want your stakeholders
to easily have access to.
00:21:10
For example, we've created a collection called
HL Library
00:21:13
that includes reports on the library website
for our communications team.
00:21:17
Because of our choice to do cross domain tracking,
we do have to filter by hostname for this
00:21:22
report to make sure that it's only showing
that data for Library.Harvard.
00:21:26
But it's very easy to filter by hostname in
a GA4 report.
00:21:31
So it's a non-issue.
00:21:33
It takes you about 5 seconds to add that filter
onto a report.
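For anyone pulling the same numbers programmatically rather than in the interface, the equivalent hostname filter looks roughly like this with the Data API's Python client; the property ID is a placeholder.

    from google.analytics.data_v1beta import BetaAnalyticsDataClient
    from google.analytics.data_v1beta.types import (
        DateRange, Dimension, Filter, FilterExpression, Metric, RunReportRequest,
    )

    client = BetaAnalyticsDataClient()
    request = RunReportRequest(
        property="properties/123456789",   # placeholder
        dimensions=[Dimension(name="pagePath")],
        metrics=[Metric(name="screenPageViews")],
        date_ranges=[DateRange(start_date="28daysAgo", end_date="yesterday")],
        # Keep only traffic to the main library site, mirroring the
        # hostname filter added to the report in the GA4 interface.
        dimension_filter=FilterExpression(
            filter=Filter(
                field_name="hostName",
                string_filter=Filter.StringFilter(
                    value="library.harvard.edu",
                    match_type=Filter.StringFilter.MatchType.EXACT,
                ),
            )
        ),
    )
    for row in client.run_report(request).rows:
        print(row.dimension_values[0].value, row.metric_values[0].value)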
00:21:38
I will say a downside of these collections
and reports is that currently, they don't
00:21:42
have many visualizations that make it easy
to parse the data.
00:21:47
And when I talk a little bit about Looker,
we'll see how we actually create better visualizations
00:21:52
using the data in Google Analytics 4.
00:21:55
I wanted to talk a bit about Tag Manager.
00:22:00
On the slide is a screenshot of Tag
Manager's tag page.
00:22:04
And Tag Manager has a really specific job
in the Google Analytics suite.
00:22:09
And it's to create tags that are tracked as
events in GA4.
00:22:12
We use it to create hostname-specific tags
based on our stakeholders' questions or website
00:22:17
usage.
00:22:20
To illustrate this, we've worked with the
communications team to recreate the set of tags
00:22:23
for the library website that they had used in
UA,
00:22:28
which we had to migrate to the GA4 instance
of Google Analytics in the Tag Manager property.
00:22:37
So we actually used that opportunity to evaluate
the current needs that the communications
00:22:41
team had to find out if they were the same
as four years ago when those initial tags
00:22:46
were created.
00:22:48
We talked about their goals, what they would
do with the data, and their current communication
00:22:53
goals.
00:22:54
Together, we came up with a new set of events
to tag in our GA4 instance.
00:23:00
And as a note, I want to share a pro tip.
00:23:03
It's really helpful to have naming conventions
and use folders if you're using Tag Manager.
00:23:09
It helps with the filtering that you can do
both in the explorations tab of GA4 and in
00:23:15
Looker Studio easily when you have a naming
convention for a website or a page type.
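For example, if every library-website tag starts with the same prefix, one begins-with filter isolates all of those events at once, whether in an exploration, a Looker Studio chart, or a script against the Data API. A sketch of that filter, with a hypothetical prefix and hypothetical event names:

    from google.analytics.data_v1beta.types import Filter, FilterExpression

    # Hypothetical convention: "hlweb_header_click", "hlweb_news_click", ...
    # One begins-with filter then catches every library-website tag; this
    # expression drops into a RunReportRequest like the ones shown earlier.
    library_site_tags = FilterExpression(
        filter=Filter(
            field_name="eventName",
            string_filter=Filter.StringFilter(
                value="hlweb_",
                match_type=Filter.StringFilter.MatchType.BEGINS_WITH,
            ),
        )
    )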
00:23:24
And now on this slide, there's an example
of the data visualization created by Looker.
00:23:30
Looker was previously known as Google Data
Studio, if you had ever heard that name before.
00:23:33
We use it to create data dashboards for stakeholders,
00:23:39
committees, and product owners.
00:23:41
In Looker, we're able to create extremely
specific and detailed visualizations that
00:23:46
fit their needs.
00:23:47
I will say that we often like to give this
view to stakeholders because it's a little
00:23:51
more digestible than a report itself in analytics
because there are some more visualization opportunities.
00:23:57
Right now, what you're seeing is the view
of our all analytics data dashboard, which
00:24:03
includes all of our library properties put
together.
00:24:07
So this is just an extremely high-level dashboard
as you can see.
00:24:12
But you can see here, we can change at the
very upper right what dates we're looking
00:24:19
at.
00:24:20
And since we did this presentation in March,
the dates might be a little less fresh than
00:24:25
Amy's that she had for the sessions.
00:24:30
But you get an idea of what that kind of looks
like and what those numbers are for our population.
00:24:37
We also use Looker to compare specific pieces
of data for all the websites in one place.
00:24:42
For example, this view is all website sessions.
00:24:46
Amy showed that in a different way.
00:24:48
But this is a way that it could look using
Looker Studio.
00:24:53
It's really easy to confirm suspicions about
your different websites, and website usage
00:25:02
when you have this type of data, and you're
able to filter specifically by hostname.
00:25:06
For example, we've long suspected that our
library catalog was our most trafficked website.
00:25:12
And by comparing it with other properties
in the same view, we're able to see just how
00:25:17
much more there are sessions as opposed to
any other library website.
00:25:22
Another conclusion we've drawn from a view
like this is that we need to put more resources
00:25:27
into understanding how to promote our other
search systems.
00:25:30
So there's a lot of different things that
you can gain by looking at the websites together
00:25:35
in a view like this.
00:25:38
And I just wanted to touch on a very specific
dashboard that we've created.
00:25:45
We've created this dashboard for D2D, for
folks who are working with metadata in
00:25:51
HOLLIS.
00:25:52
And this allows us to actually dig down more
into those events that I was talking about
00:25:58
that was created in a Tag Manager.
00:26:01
We can see on the lower right, the individual
record clicks.
00:26:06
We can actually track how many times someone
clicks, for example, to open a HOLLIS search
00:26:14
result, or something like if they choose to
export a citation.
00:26:29
But as you can see, we have a lot of different
things that we're tracking.
00:26:33
We're tracking the filters people are using
in the system.
00:26:36
We're tracking how they sort it.
00:26:38
That's about what this dashboard is for.
00:26:44
And then I wanted to just chat a little bit
about Google Search Console.
00:26:47
Once again, we're able to actually view what
keywords bring users to our site, and understand
00:26:55
the SEO and site standing for specific
pages.
00:26:59
One example that we have that's really high
up is our think tank LibGuide.
00:27:05
When people search something like think tank,
our guide is actually really high up in the
00:27:10
results.
00:27:12
And people often, even if they're not Harvard
users, go to our LibGuide to view that data.
00:27:17
And we're more easily able to tell that from
something like Google Search Console.
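For anyone who wants those keywords outside the Search Console interface, the Search Console API exposes the same numbers; a rough sketch with the Python client, where the credentials file and site URL are placeholders:

    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    # Placeholder credentials and site; the account needs read access
    # to the Search Console property.
    creds = service_account.Credentials.from_service_account_file(
        "service-account.json",
        scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
    )
    service = build("searchconsole", "v1", credentials=creds)

    response = service.searchanalytics().query(
        siteUrl="https://www.example.edu/",
        body={
            "startDate": "2023-03-01",
            "endDate": "2023-03-31",
            "dimensions": ["query", "page"],
            "rowLimit": 10,
        },
    ).execute()

    for row in response.get("rows", []):
        # Each row: the query/page pair plus clicks, impressions, position.
        print(row["keys"], row["clicks"], row["impressions"], row["position"])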
00:27:24
For my final case study, I just wanted to
discuss how we use an analytics review as
00:27:30
a UX method for current research objectives
to support something like the discovery of
00:27:35
our special collections.
00:27:37
We have about four different systems that
we use for special collections at Harvard
00:27:41
Libraries.
00:27:42
And we were curious about the patron usage
of those systems.
00:27:45
Prior to this review, we had thought that
our discovery systems for special collections
00:27:49
and archival materials were not as heavily
used as our primary search system HOLLIS.
00:27:57
But something that we learned from actually
doing an analytics review is that those sites
00:28:01
were getting way more traffic than we originally
had thought.
00:28:06
We also started to create events to be tracked
within those discovery systems.
00:28:11
For example, with our HOLLIS for archival
discovery instance, we're tracking which repositories
00:28:19
are used most often in terms of session
usage, right?
00:28:24
Because there could be a lot of views of a specific
finding aid, but they could all be in the same session.
00:28:29
So it's a little more interesting to look
at that session usage.
00:28:33
And also, that helps us to be able to filter
better for repositories who want more detailed
00:28:39
analytics on their finding aid usage
as well.
00:28:45
So we're using new events and new tags to
also help with that special collection discovery
00:28:53
piece here.
00:28:54
Both the events and the session data are going
to help us inform strategy for special collection
00:29:00
discovery over the next three years and promotion
of those discovery systems.
00:29:05
Our stakeholders can now feel more confident
in the decisions they are making with our
00:29:10
discovery systems because we are better able
to see their usage and connections between
00:29:15
their system and other systems.
00:29:18
I wanted to touch briefly on lessons learned
during this process.
00:29:22
Because it wasn't all smooth sailing.
00:29:24
There are a lot of little bugs that we had
to sort out, as there are when you're implementing
00:29:28
anything new.
00:29:31
One of the biggest things is that with all vendor
products,
00:29:33
you're often not able to fully predict
what changes they're going to make, how that's
00:29:38
going to affect you, and how you've set
up your current practices.
00:29:42
A primary example is when Data Studio switched
to Looker. In the switch, Looker decided to
00:29:47
enforce API quotas,
00:29:49
which basically limit the act of pulling
data from GA4 into a dashboard in Looker.
00:29:56
This made it really hard, at the beginning,
to access a lot of data for our dashboards.
00:30:05
And as you can see on this image here, there
was a data quota error.
00:30:10
Since the internet and GA4 community rose
up against this, they've definitely become a
00:30:19
little more lax with this API quota issue.
00:30:22
But we needed to adapt how we were sharing
these data dashboards, and how we were creating
00:30:31
them to make it a little bit better to not
hit a data quota as easily during this process.
00:30:38
Another thing that I wanted to touch on is that
in the reports and collections in GA4, you're
00:30:44
unable to filter data as specifically as you
could in Looker or explorations.
00:30:50
For us, this really helped us decide that
our primary output for stakeholders would
00:30:54
actually be more of an exploration or a Looker
dashboard, as opposed to creating a bunch of separate
00:31:03
views within just the reports tab of Google
Analytics.
00:31:06
But that was something we definitely had to
contend with, and think about when it comes
00:31:11
to how we share the data that exists there.
00:31:14
And then another thing I wanted to say is
that meeting facilitation is key.
00:31:20
We know that a lot of folks are really excited
about this change.
00:31:24
And we've found it best to really talk with
folks about what their goals are.
00:31:28
Because it's really easy to fall into the
trap, I would say, of collecting data for
00:31:34
data's sake.
00:31:37
And the thing is with using a free tool like
Google Analytics, there are some, I would
00:31:42
say, limits to how many events we can create
in Tag Manager.
00:31:47
I actually recently ran into
a problem when I was trying to do something
00:31:51
with LibGuides, where I created too many
filters, and I had to delete all my work,
00:31:57
because we decided it wasn't worth having
all these filters for the LibGuides piece.
00:32:03
But talking with folks and understanding the
goal can help us better create tags and reports
00:32:12
that actually help work get done and decisions
be made for the future.
00:32:18
And I'm going to pass it back over to Amy
to talk a bit about sharing our work.
00:32:22
AMY DESCHENES: Awesome.
00:32:24
Thanks, Meg.
00:32:26
I will add that my biggest lesson learned
from all this work is that Google Analytics
00:32:32
4 is not like Universal Analytics.
00:32:36
And it's really important to keep telling
people that.
00:32:39
Especially if folks have had the experience
of Universal Analytics, where you go in,
00:32:45
and click on things, and go down rabbit holes,
and explore.
00:32:49
And I feel like Google intentionally changed
the way they do analytics to cut down on that
00:32:57
possibility of behavior, where the best thing
to do is go in with a question, right?
00:33:04
It really forces you to have that question
ahead of going in and poking around, right?
00:33:09
And I think you are able, especially in explorations,
to still do that kind of very specific type
00:33:16
of exploration you want to do.
00:33:20
But a little bit more effort is required,
I would say, of the person who is doing the
00:33:25
exploration.
00:33:26
So I would say for folks on your teams who
are used to Universal Analytics, definitely
00:33:33
encourage people to do the trainings that
are out there, to watch a couple of tutorials,
00:33:37
so they understand and they're not like, hey,
I always looked at bounce rate.
00:33:41
It was right there.
00:33:42
Where did it go?
00:33:43
So you need to introduce the idea that it really
isn't about bounce rate anymore.
00:33:47
It's more about the opposite, which is the
engagement rate.
00:33:50
So I think just keep encouraging folks, reminding
people, that it is really different. I honestly
00:33:57
wish they had redesigned it as well, because
it still looks very similar to the old Google
00:34:02
Analytics.
00:34:03
But the approach they take, the interactions
in the reporting, and just in the ability
00:34:09
to browse in Google Analytics, it's very different
than it was with Universal Analytics.
00:34:15
All right.
00:34:18
So the last piece I wanted to share out is
how we are sharing all of this work, and letting
00:34:23
folks know that they are able to ask us questions,
and that we're able to pull reports.
00:34:29
We've been to a couple of different library
committee meetings, just sharing, for the different
00:34:33
product teams, what information we can provide.
00:34:38
We did an all staff email just explaining
that this service is available, and that you can
00:34:43
ask us one-off questions, or that it can
be part of a bigger research study.
00:34:47
We did a presentation to different members
of the leadership, those folks who are in
00:34:52
leadership especially for technology and
for communications, and we've been doing
00:34:57
other presentations as well.
00:34:59
And obviously, sharing here today.
00:35:01
And I have seen a couple of questions come
in the chat.
00:35:04
But we are very happy to stay, and talk to
you more about things we learned, and answer
00:35:10
your questions.
00:35:11
MEG MCMAHON: Thank you so much.
00:35:14
I'm going to stop.