While working at a Fortune 100 company for nine years before moving to lead my current team, I became fascinated by customer behavior. What kinds of digital offerings most deeply engage customers in their digital lives? I started by looking at case studies of the products, services, communications, and experiences that customers embraced and adopted during the first two decades of the internet. Over seven years of working on inbound marketing campaigns, I found a recurring pattern of three behaviors that drove the adoption of new digital experiences, which I call the three core behaviors of a network.
Many blogging software packages automatically nofollow user comments, and those that don't can usually be edited manually to do so. The same advice applies to other areas of your site that involve user-generated content, such as guest books, forums, shout-boards, and referrer listings. If you're willing to vouch for links added by third parties (for example, if a commenter is trusted on your site), then there's no need to use nofollow on those links; keep in mind, however, that linking to sites Google considers spammy can affect the reputation of your own site. The Webmaster Help Center has more tips on avoiding comment spam, for example by using CAPTCHAs and turning on comment moderation.
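To make this concrete, here's a minimal sketch of what a nofollowed comment link looks like in rendered HTML; the URL and the surrounding comment markup are placeholders, not the output of any particular blogging package:

```html
<!-- Outbound links in user comments carry rel="nofollow",
     telling search engines not to pass reputation to the target. -->
<div class="comment">
  <p>Great post! More thoughts on my site:
    <a href="https://example.com/some-page" rel="nofollow">example.com</a>
  </p>
</div>
```

Newer HTML also supports rel="ugc" specifically for links in user-generated content, and the two values can be combined (rel="ugc nofollow").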

Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
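As a hedged sketch of how this happens, the robots.txt below uses a hypothetical /assets/ directory; the point is the pattern, not the specific paths:

```
# Problematic rule (blocks the CSS/JS Googlebot needs to render the page):
#   User-agent: *
#   Disallow: /assets/
#
# Better: leave page resources (CSS, JavaScript, images) crawlable
# and only block areas that genuinely shouldn't be visited.
User-agent: *
Disallow: /admin/
```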
Simply put, search engine optimization (SEO) is the process of optimizing the content, technical set-up, and reach of your website so that your pages appear at the top of search engine results for a specific set of keyword terms. Ultimately, the goal is to attract visitors to your website when they search for products, services, or information related to your business.
Back-end tools, including web analytics tools and HTML validators, provide data about a website and its visitors and allow the success of a website to be measured. They range from simple traffic counters, to tools that work with log files, to more sophisticated tools based on page tagging (putting JavaScript or an image on a page to track actions). These tools can deliver conversion-related information. EBSCO uses three major tools: (a) a log-file analysis tool, WebTrends by NetIQ; (b) a tag-based analytics tool, WebSideStory's Hitbox; and (c) a transaction-based tool, TeaLeaf RealiTea. Validators check the invisible parts of websites, highlighting potential problems and many usability issues and ensuring websites meet W3C code standards. Try to use more than one HTML validator or spider simulator, because each one tests, highlights, and reports on slightly different aspects of your website.
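To make the page-tagging approach concrete, here is a minimal, hypothetical tag of the kind these tools embed in a page; the collector endpoint and parameter names are invented for illustration, not the actual markup used by Hitbox or any other vendor:

```html
<!-- Page tag: a small script reports each page view to an analytics server. -->
<script>
  // Hypothetical collector URL; real vendors supply their own snippet.
  var beacon = new Image();
  beacon.src = "https://analytics.example.com/collect" +
               "?page=" + encodeURIComponent(location.pathname) +
               "&ref=" + encodeURIComponent(document.referrer);
</script>
<!-- Fallback for visitors without JavaScript: a 1x1 tracking image. -->
<noscript>
  <img src="https://analytics.example.com/collect?page=unknown"
       width="1" height="1" alt="">
</noscript>
```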
There are lots of ways you can optimize your digital marketing assets for mobile users, and when implementing any digital marketing strategy, it's hugely important to consider how the experience will translate to mobile devices. By keeping this front of mind, you'll create digital experiences that work for your audience, and consequently achieve the results you're hoping for.
Now imagine you had that brochure on your website instead. You can measure exactly how many people viewed the page where it's hosted, and you can collect the contact details of those who download it by using forms. Not only can you measure how many people are engaging with your content, but you're also generating qualified leads when people download it.
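For illustration, a download page might gate the brochure behind a short form like the sketch below; the field names and the /download-brochure endpoint are assumptions for the example, and a real site would point the form at its own form handler or marketing platform:

```html
<!-- Lead-capture form: the visitor trades contact details for the brochure. -->
<form action="/download-brochure" method="post">
  <label for="name">Name</label>
  <input id="name" name="name" type="text" required>

  <label for="email">Work email</label>
  <input id="email" name="email" type="email" required>

  <button type="submit">Download the brochure (PDF)</button>
</form>
```

Each submission is both an engagement metric and a qualified lead, which is exactly the measurability the printed brochure lacks.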
Users will occasionally come to a page that doesn't exist on your site, either by following a broken link or by typing in the wrong URL. Having a custom 404 page that kindly guides users back to a working page on your site can greatly improve the user's experience. Your 404 page should probably have a link back to your root page and could also provide links to popular or related content on your site. You can use Google Search Console to find the sources of URLs causing "not found" errors.
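A minimal sketch of such a page follows; the "popular content" links are placeholders for whatever your site actually features:

```html
<!-- A friendly custom 404 page that routes lost visitors back into the site. -->
<!DOCTYPE html>
<html lang="en">
<head><meta charset="utf-8"><title>Page not found</title></head>
<body>
  <h1>Sorry, we can't find that page.</h1>
  <p>Try the <a href="/">home page</a>, or one of these popular sections:</p>
  <ul>
    <li><a href="/blog/">Blog</a></li>
    <li><a href="/products/">Products</a></li>
  </ul>
</body>
</html>
```

Whatever the page looks like, make sure the server returns an actual 404 HTTP status code for missing URLs rather than a 200, so search engines don't index the error page as real content.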
You know who and where your best customers are — Microsoft Advertising lets you choose when and how to reach them. Control where your ads appear by city, state, country and worldwide. Fine-tune your targeting even further by setting the time of day to display your ads and on which devices. By targeting only your most relevant customers, you can reduce unnecessary spending.
You may not want certain pages of your site crawled, because they might not be useful to users if they appear in a search engine's results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest the Webmaster Help Center guide on using robots.txt files.
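As a small sketch (hostnames and paths are placeholders): the file below, served at https://www.example.com/robots.txt, blocks two directories from crawling; a subdomain such as blog.example.com would need its own copy at https://blog.example.com/robots.txt:

```
# robots.txt for www.example.com; these rules do NOT apply to subdomains.
User-agent: *
Disallow: /internal-search/
Disallow: /staging/
```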

The digital marketer usually focuses on a different key performance indicator (KPI) for each channel so they can properly measure the company's performance across each one. A digital marketer who's in charge of SEO, for example, measures their website's "organic traffic": the portion of traffic that comes from visitors who found a page of the business's website via a Google search.