If you want to go a step further and see which traffic sources lead to form completions (like newsletter or app signups), I made a utility that captures UTM parameters and then inserts them into any form submitted during that session: https://github.com/gkogan/sup-save-url-parameters
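For anyone curious how that works, here's a minimal sketch of the idea (this is not the actual sup-save-url-parameters code, just an illustration of the technique): grab the utm_* parameters from the current URL, then add them to each form as hidden inputs so they ride along with the submission.

```javascript
const UTM_KEYS = ['utm_source', 'utm_medium', 'utm_campaign', 'utm_term', 'utm_content'];

// Extract any utm_* parameters from a full URL string.
function getUtmParams(url) {
  const params = new URL(url).searchParams;
  const utm = {};
  for (const key of UTM_KEYS) {
    if (params.has(key)) utm[key] = params.get(key);
  }
  return utm;
}

// Inject the captured values into a form as hidden inputs so they are
// submitted along with the user's signup details.
function addUtmToForm(form, utm) {
  for (const [name, value] of Object.entries(utm)) {
    const input = document.createElement('input');
    input.type = 'hidden';
    input.name = name;
    input.value = value;
    form.appendChild(input);
  }
}

// Browser-only wiring, guarded so the pure functions above can be used anywhere.
if (typeof document !== 'undefined') {
  const utm = getUtmParams(window.location.href);
  document.querySelectorAll('form').forEach((f) => addUtmToForm(f, utm));
}
```

The real utility also persists the parameters for the session, so a form filled out on a later page still gets them.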
Kind of related: I saw a webinar by Ontraport recently where they went a step further by saving Google Analytics client IDs against contacts for deep-funnel conversion tracking. The idea is that they don't really care who signs up for a free trial; they want to know who upgrades to paid within the trial period, and by then the attribution is usually all messed up, or they don't know who is who. So saving those details up front and calling Google manually later helps track this.
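A hedged sketch of how that client-ID capture can work (I'm assuming the common approach of reading the `_ga` cookie, not whatever Ontraport specifically showed): the cookie value looks like `GA1.2.1194554845.1547143023`, and the client ID is the last two dot-separated segments.

```javascript
// Pull the Google Analytics client ID out of a cookie header string.
// The _ga cookie value is "GA1.<domain-depth>.<random>.<timestamp>";
// the client ID GA uses is the final "<random>.<timestamp>" pair.
function gaClientIdFromCookie(cookieHeader) {
  const match = cookieHeader.match(/(?:^|;\s*)_ga=([^;]+)/);
  if (!match) return null;
  const parts = match[1].split('.');
  if (parts.length < 4) return null;
  return parts.slice(-2).join('.');
}
```

Save that value as a custom field on the contact at signup, and when the upgrade-to-paid event happens later you can report it against the original session (e.g. via the Measurement Protocol), so the attribution survives the trial period.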
It seems there are still a lot of little tricks out there that aren't standard practice.
(Disclosure: I created this to help my customers at https://www.terminusapp.com)
If I were to use localStorage, I would only do it if the first-touch and last-touch tags (even if empty) were saved as separate values. That just means slightly more coding, which I haven't gotten around to. Pull requests welcome :)
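To make the first-touch/last-touch split concrete, here's a sketch of what I mean (storage keys are made up for illustration; the merge logic is kept pure so it's easy to reason about separately from localStorage itself):

```javascript
// Given what is already stored and the query string seen on this visit,
// return what should be stored now: first touch is written once and never
// overwritten (even if it was empty), last touch is updated on every visit.
function mergeTouches(stored, currentParams) {
  return {
    first: stored.first !== undefined ? stored.first : currentParams,
    last: currentParams,
  };
}

// Browser-only wiring (guarded so the function above works anywhere).
if (typeof localStorage !== 'undefined') {
  const stored = {
    first: localStorage.getItem('utm_first') ?? undefined,
    last: localStorage.getItem('utm_last') ?? undefined,
  };
  const next = mergeTouches(stored, window.location.search);
  localStorage.setItem('utm_first', next.first);
  localStorage.setItem('utm_last', next.last);
}
```

The point of storing first touch even when empty is that "arrived with no tags" is itself meaningful: it distinguishes a genuinely untagged first visit from one you simply never recorded.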
In the last five years it hasn't come up at all: everyone is doing it, and people seem to dig through their analytics less than ever before. But if it helps you track links to your own content, as shown in the article, it's certainly worth a go.
Another amusing point is that HN didn't used to strip these parameters, so sometimes I could see when people had reposted things from our newsletters on to HN (and kept the utm params in) which was always a buzz :-)
Mailchimp adds their own trackers on top of the links you insert into the newsletter, which they use for tracking open/click rates, but that's almost less interesting than the source.
Why does it matter that I had a 36% click rate on the last issue? Is that good? What if I instead had a 10% click rate, but one of those readers shared it with 10,000 more people?
I understand why in general people don't like tracking, especially here on Hacker News, but I think some of it could lead to much better outcomes for all parties involved. I wonder what other solutions there are to help piece together your audience, and their interests.
First, we had issues with certain sites not working at all if we added them, and since our link forwarder was doing it automatically, that made life difficult. We came up with a way to turn them off on an ad hoc basis, but it was annoying.
Second, I didn't feel they were really moving the needle in any useful way. I don't think people are looking at their stats every day like they used to.
Third, it just felt like more litter/junk for tracking purposes. While there's no privacy aspect to it, I just felt like going a little cleaner in this regard. But... I can't say they'll never come back :-)
I have never observed a “utm” query param actually improve the quality of the response.
We all know why this obviously positive functionality isn’t built into the browser: because browser vendors rely on hostile business practices to survive. Still no technology to transact with the site you’re visiting for their content....
You have it backwards. It's more tracking for the webmaster/marketer to know what was popular and what wasn't.
Might even be good, as a way to calculate how well your internal links are working in keeping the users within the property.
For example, put a link at the end of the blog post, and then track how many users clicked it, vs. how many exited on that page.
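A hedged sketch of the tagging side of that experiment (the parameter values here are illustrative, not a recommendation):

```javascript
// Tag an end-of-post internal link so clicks on it are distinguishable
// from other ways of reaching the target page.
function tagInternalLink(href, campaign) {
  const url = new URL(href);
  url.searchParams.set('utm_source', 'blog');
  url.searchParams.set('utm_medium', 'internal');
  url.searchParams.set('utm_campaign', campaign);
  return url.toString();
}
```

Worth knowing before trying it: in Google Analytics, internal utm tags overwrite the visitor's original source attribution, which is one of the main reasons people advise against this approach.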
In any case, I agree: these workarounds are never worth the doubt they introduce into your data. There are ways to track internal flows without utm tags.
https://chrome.google.com/webstore/detail/neat-url/jchobbjgi...
This garbage breaks the UX of the internet.
You just rarely see it, because marketing execution is already an operational nightmare at most places (even more so at scale), and the centralized coordination required for the campaign-key method introduces a lot of friction, overhead, and potential bottlenecks into processes, with little perceived gain. The decentralized model (i.e. the entire dataset in the URL) is far easier to enforce compliance with across all parties, since you can provide tooling (e.g. a link-builder spreadsheet) that lets each party integrate it into their workflow with minimal overhead or ongoing coordination.
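The link-builder tooling mentioned above can be as simple as a shared function (or spreadsheet formula) like this sketch, where every team fills in the same fields and gets a consistently tagged URL with no central ID registry to coordinate. Field names follow the standard utm_* keys; everything else here is illustrative:

```javascript
// Build a tagged URL from a base URL plus campaign fields.
// Empty/omitted fields are simply left off the URL.
function buildTrackedUrl(baseUrl, { source, medium, campaign, term, content }) {
  const url = new URL(baseUrl);
  const fields = {
    utm_source: source,
    utm_medium: medium,
    utm_campaign: campaign,
    utm_term: term,
    utm_content: content,
  };
  for (const [key, value] of Object.entries(fields)) {
    if (value) url.searchParams.set(key, value);
  }
  return url.toString();
}
```

Compliance then reduces to "use the builder," which is far easier to enforce than "register every link with the central team first."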
[1] https://support.google.com/analytics/answer/6066741?hl=en
Right now I can add whatever UTM parameters to any links I send without keying it into a database system, submitting a CR to the IT team, submitting it to Google, or any other nonsense. Details will be collected immediately whether through web logs, google analytics, or something else.
If one email that I send has links to multiple domains (some of which I don't control, since SaaS is eating the web), then we'd either have to track and manage multiple IDs that change for each link, or use a scary-looking GUID generated by a centralized system (no thanks). That sounds like a real headache.
Hiding it behind an ID would make it prettier but it also reduces transparency to the end user of what is being collected. I think if it's going to be there anyway, I'd rather see it.
The URL is not the UX anymore.
This is why building "the obvious next steps" [1] is such an important concept. This is also why apps like Facebook or Pinterest have been so successful. Within this new UX, users are not looking to leave, they just want to find more information that is relevant to them.
[1] https://www.gkogan.co/blog/ridiculously-obvious-next-step/