
Google resembles an iceberg: there’s the part above the water that we can see and use every day, and there’s the part beneath the water that we don’t see and know little about.

While many of us are concerned about the aspects of Google we don’t see — the parts that threaten our privacy, or monopolize the web — there’s no denying that Google offers some amazing products and tools, many of them free, all from the convenience of a single login.

Today we’re going to take a look at 12 tools from Google that really do bring something positive to the table.

1. Polymer

Polymer is an open-source JavaScript library from Google for building web applications using Web Components. The platform comes with a ton of libraries and tools to help designers and developers unlock the web’s potential by taking advantage of features like HTTP/2, Web Components, and Service Workers. 

The main feature of Polymer is Web Components. With Web Components, you can create custom elements that work on any site, interoperate seamlessly with the browser’s built-in elements, and play well with frameworks of all kinds. Products like LitElement (a simple base class for creating fast, lightweight web components) and PWA Starter Kit make Polymer easy to use. If you like, you can build your app entirely out of Web Components.
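As a rough sketch of what a LitElement-based component looks like (the element name here is hypothetical, and the import assumes the lit-element package):

// my-greeting.js: a minimal LitElement web component (element name is hypothetical)
import { LitElement, html, css } from 'lit-element';

class MyGreeting extends LitElement {
  // Reactive properties can be set as attributes in markup
  static get properties() {
    return { name: { type: String } };
  }

  // Styles are scoped to this component's shadow DOM
  static get styles() {
    return css`p { font-weight: bold; }`;
  }

  constructor() {
    super();
    this.name = 'world';
  }

  render() {
    return html`<p>Hello, ${this.name}!</p>`;
  }
}

customElements.define('my-greeting', MyGreeting);

Once defined, <my-greeting name="Polymer"></my-greeting> can be dropped into any page, with or without a framework.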

2. Lighthouse

Google Lighthouse is an open-source, automated tool for improving the quality of web pages. The software allows you to audit web pages for performance, SEO, accessibility, and more. You can run Lighthouse in Chrome DevTools, directly from the command line, or as a Node module.
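As a minimal sketch of the Node route (assuming the lighthouse and chrome-launcher npm packages; the audited URL is a placeholder):

const lighthouse = require('lighthouse');
const chromeLauncher = require('chrome-launcher');

(async () => {
  // Launch a headless Chrome instance for Lighthouse to drive
  const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });
  const options = { output: 'json', onlyCategories: ['performance'], port: chrome.port };

  // Run the audit and log the overall performance score (0-100)
  const result = await lighthouse('https://example.com', options);
  console.log('Performance score:', result.lhr.categories.performance.score * 100);

  await chrome.kill();
})();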

To use Lighthouse in Google Chrome, just go to the URL you want to audit (you can audit any URL on the web), open Chrome DevTools, and click the Audits tab. After you have run the audit, Lighthouse will give you an in-depth report on the web page.

With these reports, you will see which parts of your web page you need to optimize. Each report has a reference doc that explains why that audit is important and also shows you the steps you can take to fix it. 

You can also use Lighthouse CI to prevent regressions on your sites. Using Lighthouse Viewer, you can view and share reports online. You can also share reports as JSON or as GitHub Gists.

Lighthouse also comes with a feature called Stack Packs that allows Lighthouse to detect what platform a site is built on. It also displays specific stack-based recommendations.

3. Google Analytics

Google Analytics is the gold standard of analytics services. It can be installed on your site for free with a small snippet of JavaScript, and it lets you see all kinds of details about your site visitors, like which browser they’re using and where they’re from.
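The snippet in question is small; a typical gtag.js install looks roughly like this (UA-XXXXX-Y stands in for your own tracking ID):

<!-- Global site tag (gtag.js); goes in <head>. UA-XXXXX-Y is a placeholder ID. -->
<script async src="https://www.googletagmanager.com/gtag/js?id=UA-XXXXX-Y"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){ dataLayer.push(arguments); }
  gtag('js', new Date());
  gtag('config', 'UA-XXXXX-Y');
</script>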

By using Google Analytics you can make decisions about your site based on data rather than hunches, and therefore be somewhat confident that the decisions you make will produce the outcome you are expecting.

4. Flutter

Flutter is Google’s UI toolkit for building natively compiled applications for mobile, web, and desktop from a single codebase. The toolkit is open source and free to use. The best part of Flutter is that it works with existing code. 

The toolkit has a layered architecture that allows for full customization, which results in fast rendering and flexible designs. It also comes with fully customizable widgets that let you build native interfaces in minutes. With these widgets, you can add platform features such as scrolling, navigation, icons, and fonts to deliver full native performance on both iOS and Android.

Flutter also has a feature called hot reload that allows you to easily build UIs, add new features, and fix bugs faster. You can also compile Flutter code to native ARM machine code using Dart native compilers. 

5. Google API Explorer

Google has a huge library of APIs that are available to developers but finding these APIs can be difficult. Google API Explorer makes it easy for developers to locate any API. On the Google API Explorer web page, you will see a complete list of the entire API library. You can easily scroll through the list or use the search box to filter through the API list. 

The best part of Google API Explorer is that each entry links to a reference page with more details on how to use the API. API Explorer is also an excellent way to try out methods (in the Monitoring API, for example) without having to write any code.

6. Puppeteer

Puppeteer is a project from the Google Chrome team. It is a Node library that provides a high-level API for controlling Chrome (or any other browser based on the Chrome DevTools Protocol), including headless Chrome, and for executing common actions much as a real user would. It is a useful tool for scraping, testing, and automating web pages.

Here are some things you can do with Puppeteer: generate screenshots and PDFs of pages, run UI tests, test Chrome extensions, automate form submission, generate pre-rendered content, and crawl single-page applications.
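For instance, here is a minimal sketch that captures both a screenshot and a PDF of a page (the URL and file names are placeholders):

const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch(); // headless by default
  const page = await browser.newPage();
  await page.goto('https://example.com');

  // Capture a full-page screenshot and a PDF of the same page
  await page.screenshot({ path: 'example.png', fullPage: true });
  await page.pdf({ path: 'example.pdf', format: 'A4' });

  await browser.close();
})();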

7. Codelabs

Google Developers Codelabs is a handy tool for beginner developers, and even advanced developers who want to broaden their knowledge. Codelabs provide guided, hands-on coding tutorials. The Codelabs site is broken down into tutorial sessions on a range of topics.

With the tutorials on Codelabs, you can learn how to build applications from scratch. Some of the tutorial categories include Augmented Reality, TensorFlow, Analytics, Virtual Reality, G Suite, Search, Google Compute Engine, and Google APIs on iOS.

8. Color Tool

Color Tool makes it easy for web designers to create, share, and apply colors to their UI. It also measures the accessibility level for any color combination before exporting to the palette. The tool comes with 6 user interfaces and offers over 250 colors to choose from. 

The tool is also very easy to use. All you need to do is pick a color and apply it to the primary color scheme; switch to the secondary color scheme, and pick another color. You can also switch to Custom to pick your own colors. After you have selected all your colors, use the Accessibility feature to check if all is good before exporting it to your palette. 

9. Workbox

Workbox is a set of JavaScript libraries and Node modules. The JavaScript libraries make it easy to add offline support to web apps. The Node modules make it easy to cache assets and offer other features to help users build Progressive Web Apps. Some of these features include pre-caching, runtime caching, request routing, background sync, debugging, and greater flexibility than sw-precache and sw-toolbox. 

With Workbox, you can add a quick rule that caches Google Fonts, images, JavaScript, and CSS files. Caching these files makes your web pages run faster and consume less storage. You can also pre-cache files in your web app using the Workbox CLI, Node module, or webpack plugin.
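As a sketch of what such rules look like in a service worker (this assumes the Workbox v5 CDN build; adjust the version and cache names to your setup):

// service-worker.js
importScripts('https://storage.googleapis.com/workbox-cdn/releases/5.1.2/workbox-sw.js');

// Cache images with a cache-first strategy, capped at 50 entries
workbox.routing.registerRoute(
  ({ request }) => request.destination === 'image',
  new workbox.strategies.CacheFirst({
    cacheName: 'images',
    plugins: [new workbox.expiration.ExpirationPlugin({ maxEntries: 50 })],
  })
);

// Serve Google Fonts stylesheets from cache while revalidating in the background
workbox.routing.registerRoute(
  ({ url }) => url.origin === 'https://fonts.googleapis.com',
  new workbox.strategies.StaleWhileRevalidate({ cacheName: 'google-fonts-stylesheets' })
);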

10. PageSpeed Insights

PageSpeed Insights is a handy tool from Google Developers that analyzes the content of a web page, then generates suggestions on how to make the page faster. It gives reports on the performance of a web page on both desktop and mobile devices. At the top of the report, PageSpeed Insights provides a score that summarizes the page’s performance. 

11. AMP on Google

AMP pages load faster and look better than standard HTML pages on mobile devices. AMP on Google allows you to enhance your AMP pages across Google. AMP is a web component framework for creating user-first websites, ads, emails, and stories. One benefit of AMP is that it lets your web pages load almost instantly across all devices and platforms, improving the user experience.

12. Window Resizer

When creating websites, it is important that developers test them for responsive design – this is where Window Resizer comes in. Window Resizer is a Chrome extension that resizes the browser window so that you can test your responsive design on different screen resolutions. The common screen sizes offered are desktop, laptop, and mobile, but you can also add custom screen sizes. 

 

Featured image via Unsplash.

Source: Webdesignerdepot

As a web designer, you’re constantly being bombarded with messages that tell you to acquire new skills, try new tools, and keep on hustling.

But if you’re constantly changing things up, does it do the opposite of what you originally set out to do? In other words, if you always have to start over, is it possible to ever really achieve anything?

I think it ultimately depends on why you’re making the change.

When Change Is the Right Move for Web Designers

One of the reasons I despise New Year’s resolutions is because it’s change for the sake of change:

It’s a new year, so it’s time to get all hyped up about this one thing I need to change about myself!

There’s a reason why so many resolutions fail by February. When you force a change, it’s really hard to stay invested in it, especially if it’s something you’ve chosen to do because everyone else has.

Change should be driven by necessity.

That said, when it comes time to make changes as a web designer, is it ever really necessary? Or are you learning new skills, trying new tools, or switching up your client list simply because it’s what you believe you have to do?

It’s important to be open to change, but you should only invest your time, money, or effort when it’s the absolute right move for you. Here are some ways you’ll know when that’s the case:

Learn New Skills To…

…Round Out the Basics

If you’re a new designer and there are gaps in your education and training (and I don’t mean formally, just in general), then there’s no reason to hesitate in spending time to acquire those skills.

This doesn’t just go for basic skills as a web designer or as a coder. This also goes for skills you need to become a successful freelancer.

…Add Evergreen Skills to Future-Proof Your Position

As you move up in your career, you’ll eventually find other skills worth learning. Just make sure they’ll help you move the needle.

The best way to do that is to focus on acquiring evergreen skills that’ll always be useful to you, no matter what stage you’re at in your career or how the design landscape changes. They should also go beyond the average skill set of a designer, so they help you stand out further from the pack.

… Create a Better Situation for Yourself

The web is constantly evolving, which means that your responsibilities and skills as a web designer will have to change in order to adapt. Whenever one of these shake-ups occurs, you should either be ready to master the needed skill right away or, better yet, have been working on it beforehand.

Take Google’s mobile-first indexing, for instance. Google announced it was going to make this shift years before website rankings were impacted. Designers had plenty of time not only to learn what was needed to design for the mobile-first web, but to get all their existing clients’ sites in shape for it.

Adopt New Tools When…

…Your Existing Ones Are Slowing You Down

If you’re doing a lot of things from scratch (like writing emails to clients or creating contracts), that’s a good sign your toolbox needs some improvement.

As a web designer, you should be focused on creating, not on the tedious details involved in running a business or communicating with clients. That’s just not a good use of your time. A lot of this stuff can easily be automated with tools and templates.

…You’re Turning Down Business

In some cases, it’s the right thing to say “no” to prospective clients — like when they’re a bad fit or can’t afford your rates. However, there are other times when you desperately want to be able to say “yes”, but you don’t have the capacity for the job or you’re unable to cover the full scope of what they need.

This is where new tools come in handy. For instance, let’s say you’ve been approached by an ecommerce company that not only wants you to build a new store, but also needs it fully optimized for search (and it’s not the first time this has happened). Rather than turn something like that down, you may find that adding an SEO tool to your toolbox is all you need to be able to say “yes”.

…You Have Extra Room in Your Budget

Obviously, you don’t want to throw away money on a bunch of tools simply because a ton of people are talking about them. But you’ll eventually get to a point where the tools that served you well in the first year of business need to be replaced.

If you get to a point where you have extra time to experiment and there’s room in your budget for upgraded tools, go ahead and assess what you currently have and test out replacement solutions that will help you work better, faster, and smarter.

Look for New Business Opportunities If…

…You’re Not Doing Well

“Well” here is subjective. For instance:

  • If you’re not doing well financially, you probably need to look for more clients;
  • If you’re not doing well in terms of how you get along with clients, you should explore a niche that’s a better fit;
  • If you’re not happy with your job because burnout and stress have overtaken your life, then you might consider exploring other avenues of work.

When something has been amiss for a while, the last thing you should do is lean into it and hope it gets better.

…The Web is Changing

Notice a trend here? Each of these changes (skills, tools, and now business opportunities) is often driven by the fact that the web is always changing. And as the web changes, you have to be ready to evolve.

In terms of business opportunities, what you’ll realistically need to do is look for new kinds of design work as technologies make your job obsolete. Take website builders like Wix or Shopify, for example. As business owners and entrepreneurs take it upon themselves to build their own websites, more and more web designers will need to find other kinds of clients and jobs to take on.

…You Want to Diversify Your Income

This is something many web designers are doing already as they’ve discovered how beneficial it is to have predictable recurring revenue streams.

But even if you’ve already found one way to diversify and stabilize your income (like by offering website maintenance services), you may become interested in exploring other opportunities along the way. If you have the capacity to pursue them, then go for it.

Is Change a Good Idea?

As you can see, change can be a very good thing for a web designer, their business, and their clients. However, there should be a very good reason for the change and you need to prepare yourself for how it’s going to impact what you’re doing now before implementing it. No amount of change can happen without some level of sacrifice.

 

Featured image via Unsplash.

Source: Webdesignerdepot

Contentful; Webster’s Dictionary defines “contentful” as… not found. Clearly someone made up this word, but that is not necessarily a bad thing.

The world of user experience metrics is moving quickly, so new terminology is needed. Largest Contentful Paint (LCP) is one of a number of metrics measuring the render time of content on a web page.

What is Largest Contentful Paint?

Google defines LCP as “the render time of the largest content element visible within the viewport.” For what we are talking about in this blog, we will consider “content” to be an image, typically a JPEG or PNG file. In most cases, “largest” points to a hero image that is “above the fold” and is one of the first images people will notice when loading the page. Applying optimization to this largest content is critical to improving LCP.

It is probably more instructive to view LCP relative to other metrics. For example, First Contentful Paint (FCP) and Visually Complete bookend LCP.

Each metric has its pros and cons, but LCP is a happy medium. LCP marks when web page loading starts to have a substantial impact on user experience.

In Google’s opinion, to provide a good user experience, LCP should occur within 2.5 seconds of when the page first starts loading. Poor values are anything greater than 4 seconds.
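If you want to observe the metric yourself, supporting browsers expose it through the PerformanceObserver API; a minimal sketch:

// Log LCP candidates as the page loads; the last entry reported is the final LCP
new PerformanceObserver((entryList) => {
  for (const entry of entryList.getEntries()) {
    console.log('LCP candidate (ms):', entry.startTime, entry.element);
  }
}).observe({ type: 'largest-contentful-paint', buffered: true });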

How Does Largest Contentful Paint Impact Lighthouse Scores and SEO?

LCP is now part of several “Core Web Vitals” scores that Google will measure in its ranking algorithm. Each of the Core Web Vitals represents a distinct facet of the user experience, is measurable in the field, and reflects the real-world experience of a critical user-centric outcome.

In the case of the overall Google Lighthouse score, LCP represents 25% weighting on the performance score of Lighthouse version 6.0. This makes LCP the most important Core Web Vitals metric in determining the performance score.

While Google has indicated that content is still the most important factor in SEO ranking, a better user experience (as measured by Core Web Vitals) will generate higher rankings in a crowded field. If there are many websites competing for the top search engine spots, then Largest Contentful Paint will play a critical factor in rankings.

How to Improve Largest Contentful Paint

Now that you know that LCP is important, what can you do to improve it by making content load faster? Google provides a number of suggestions, but the most effective technique is to optimize content for the device requesting it.

For example, suppose a website includes an 800KB JPEG image intended for high-resolution desktops. On a smartphone, that image could be optimized down to less than 100KB with no perceptible impact on quality. LCP can improve by more than 60%, often several seconds, through this single optimization.
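An image CDN does this resizing automatically on the server, but the underlying idea can be sketched in plain markup with srcset (file names here are hypothetical):

<!-- The browser downloads the smallest candidate that satisfies the display width -->
<img src="hero-1600.jpg"
     srcset="hero-480.jpg 480w, hero-800.jpg 800w, hero-1600.jpg 1600w"
     sizes="100vw"
     alt="Hero image">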

Find Savings in Largest Contentful Paint by using Image Speed Test

Image Speed Test is a great tool offered by ImageEngine.io that provides an analysis of LCP improvement opportunities. Just paste in the URL of the web page you are interested in optimizing, and the test will show you:

  • Image Payload Reduction
  • Speed Index
  • Largest Contentful Paint
  • Page Load Time (Visually Complete)

It also provides a video of the web page loading with and without optimizations. Finally, it analyses each image to estimate payload savings. In the example the tool presents, the “largest content” on the page is a hero image; with optimizations, the image payload is reduced by 94%, which delivers a huge improvement in LCP.

How Does ImageEngine Improve LCP?

ImageEngine is an image content delivery network (CDN) service that makes image optimization simple. Basically, for each image on the page, the image CDN will:

  1. Detect the device model requesting the web page;
  2. Optimize the image in terms of size, compression, image format;
  3. Deliver via a CDN edge server that is geographically closest to the user.

ImageEngine improves web performance for every image on the page, including the largest. You can learn more about ImageEngine here, and also sign up for a free trial.

Best Practices: Preconnect

In addition to using an image CDN like ImageEngine, a few other best practices can improve LCP. Using resource hints to preconnect to the hosts serving your content can streamline the download process.

For example, putting a preconnect link statement in the HTML (shown below) will accelerate the download process. The link statement makes the browser connect to the third party as early as possible so that the download can start sooner. ImageEngine’s optimizations make each image download smaller and faster, but preconnect saves time in the connection phase.
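A typical preconnect hint for a third-party image host looks like this (the hostname is a placeholder):

<!-- Resolve DNS and open the TCP/TLS connection before any image is requested -->
<link rel="preconnect" href="https://images.example.com">

For resources fetched with CORS, such as web fonts, add the crossorigin attribute to the hint.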

Best Practices: Minimize Blocking JavaScript and CSS

When JavaScript or CSS is “blocking” it means that the browser needs to parse and execute CSS and JavaScript in order to paint the final state of the page in the viewport.

Any website today relies heavily on both JavaScript and CSS, which means it is almost impossible to avoid all render-blocking resources. As a general note: be careful about what kind of CSS and JavaScript is referenced inside the <head> element. Make sure that only the strictly necessary resources are loaded there; the rest can be deferred or loaded asynchronously.
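As a small illustration of keeping the <head> lean (file names are placeholders):

<head>
  <!-- Strictly necessary styles load normally (or are inlined; see the next section) -->
  <link rel="stylesheet" href="critical.css">

  <!-- defer: download in parallel, execute in order after the document is parsed -->
  <script src="app.js" defer></script>

  <!-- async: download in parallel, execute as soon as ready (independent scripts only) -->
  <script src="analytics.js" async></script>
</head>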

When looking to improve the LCP specifically, there are some practices worth looking into more deeply.

Inline Critical CSS

It is not an easy task, but if the browser can avoid making a request to get the CSS needed to render the critical part of the page (usually the part above the fold), the LCP is likely to occur earlier. You will also avoid content shifting around, and maybe even a Flash of Unstyled Content (FOUC).

The critical CSS, meaning the CSS the browser needs to set up the structure and important styles of the part of the page shown above the fold, should be inlined. This inlined CSS may also refer to background images, which of course should also be served by an image CDN.
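A common pattern is to inline the critical rules and load the full stylesheet without blocking rendering; a sketch (file names are placeholders, and the preload trick needs the noscript fallback in production):

<head>
  <!-- Critical, above-the-fold rules inlined directly -->
  <style>
    header { min-height: 60vh; background-image: url('hero.jpg'); }
    h1 { font: bold 2.5rem/1.2 sans-serif; }
  </style>

  <!-- Full stylesheet loaded asynchronously via the preload trick -->
  <link rel="preload" href="styles.css" as="style"
        onload="this.onload=null;this.rel='stylesheet'">
  <noscript><link rel="stylesheet" href="styles.css"></noscript>
</head>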

Do Not Use JavaScript to (lazy) Load Images

Many modern browsers natively support lazy loading, without the use of JavaScript. Because images usually are heavily involved in the performance of LCP, it is best practice to leave image loading to the browser and avoid adding JavaScript in order to lazy load images.

Lazy loading driven by JavaScript will add additional latency if the browser first has to load and parse JavaScript, then wait for it to execute, and then render images. This practice will also break the pre-parser in the browser.

If an image CDN is used to optimize images, then the benefits of lazy loading become much smaller. Especially large hero images that are above the fold have a large impact on LCP and will not benefit from being lazy loaded with JavaScript. It is best not to make JavaScript a blocking issue for rendering images, but rather rely on the browser’s own ability to select which images should be lazy loaded.
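In markup, relying on the browser’s native mechanism is as simple as the loading attribute; a sketch (file names hypothetical):

<!-- Above the fold: the likely LCP element, so never lazy-load it -->
<img src="hero.jpg" alt="Hero" width="1600" height="900">

<!-- Below the fold: let the browser defer the fetch natively -->
<img src="gallery-1.jpg" alt="Gallery photo" width="800" height="600" loading="lazy">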

 

[– This is a sponsored post on behalf of ImageEngine –]

Source: Webdesignerdepot

We are gathered here today….

Today I write in memory of Adobe Flash (née Macromedia), something that a bunch of people are actually too young to remember. I write this with love, longing, and a palpable sense of relief that it’s all over. I have come to praise Flash, to curse it, and finally to bury it.

We’ve been hearing about the death of Flash for a long time. We know it’s coming. December 2020 has been announced as the official timeframe for removal, but let’s be real about this: it’s dead. It’s super-dead. It’s people-are-selling-Flash-game-archives-on-Steam dead.

That last bit actually makes me happy, because Flash games were a huge part of my childhood, and the archives must be preserved. Before I’d ever heard of video cards, frames per second, and “git gud”, I was whiling away many an hour on disney.com, cartoonnetwork.com, MiniClip, Kongregate, and other sites, looking for games.

I think we’ve established in my previous work that even as a missionary kid, I did not have a social life.

The Internet itself gave me a way to reach out and see beyond my house, my city, and my world, and it was wonderful. Flash was a part of that era when the Internet felt new, fresh, and loaded with potential. Flash never sent anyone abuse, or death threats. Flash was for silly animations, and games that my parent’s computer could just barely handle, after half an hour of downloading.

I even built my first animated navigation menus in Flash, because I didn’t know any better. At all. But those menus looked exactly like the ones I’d designed in Photoshop, so that’s what mattered to me, young as I was.

That was a part of Flash’s charm, really.

What Flash Got Right

Flash Brought Online Multimedia into the Mainstream

Funny story, JavaScript was only about a year old when Flash was released. While HTML5 and JS are the de-facto technologies for getting things done now, Flash was, for many, the better option at launch. JS had inconsistent support across browsers, and didn’t come with a handy application that would let you draw and animate whatever you wanted.

It was (in part) Flash that opened up a world of online business possibilities, that made people realize the Internet had potential rivalling that of television. It brought a wave of financial and social investment that wouldn’t be seen again until the advent of mainstream social networks like MySpace.

The Internet was already big business, but Flash design became an industry unto itself.

Flash Was Responsive

Yeah, Flash websites could be reliably responsive (and still fancy!) before purely HTML-based sites pulled it off. Of course, it was called by other names back then, names like “Liquid Design”, or “Flex Design”. But you could reliably build a website in Flash, and you knew it would look good on everything from 800×600 monitors, to the devastatingly huge 1024×768 screens.

You know, before those darned kids with their “wide screens” took over. Even then, Flash still looked good, even if a bunch of people suddenly had to stop making their sites with a square-ish aspect ratio.

Flash Was Browser-Agnostic

On top of being pseudo-responsive, the plugin-based Flash player was almost guaranteed to work the same in every major browser. Back in a time when Netscape and Internet Explorer didn’t have anything remotely resembling feature parity, the ability to guarantee a consistent website experience was to be treasured. When Firefox and Chrome came out, with IE lagging further behind, that didn’t change.

While the CSS Working Group and others fought long and hard for the web to become something usable, Flash skated by on its sheer convenience. If your site was built in Flash, you didn’t have to care which browsers supported the <marquee> tag, or whatever other ill-conceived gimmick was new and trendy.

Flash Popularized Streaming Video

Remember when YouTube had a Flash-based video player? Long before YouTube, pretty much every site with video was using Flash to play videos online. It started with some sites I probably shouldn’t mention around the kids, and then everyone was doing it.

Some of my fondest memories are of watching cartoon clips as a teenager. I’d never gotten to watch Gargoyles or Batman: The Animated Series as a young kid; those experiences came via the Internet, and yes… Flash. Flash video players brought me Avatar: The Last Airbender, which never ever had a live action adaptation.

Anyway, my point: Flash made online video streaming happen. If you’ve ever loved a Netflix or Prime original show (bring back The Tick!), you can thank Macromedia.

What Flash Got Wrong

Obviously, not everything was rosy and golden. If it was, we’d have never moved on to bigger, better things. Flash had problems that ultimately killed it, giving me the chance, nay, the responsibility of eulogizing one of the Internet’s most important formative technologies.

Firstly, it was buggy and insecure: This is not necessarily a deal-breaker in the tech world, and Microsoft is doing just fine, thank you. Still, as Flash matured and the code-base expanded, the bugs became more pronounced. The fact that it was prone to myriad security issues made it a hard sell to any company that wanted to make money.

Which is, you know, all of them.

Secondly, it was SEO-unfriendly: Here was a more serious problem, sales-wise. While we’re mostly past the era when everyone and their dog was running a shady SEO company, search engines are still the lifeblood of most online businesses. Having a site that Google can’t index is just a no-go. By the time Google had managed to index SWF files, it was already too late.

Thirdly, its performance steadily got worse: With an expanding set of features and code, the Flash plugin just took more and more resources to run. Pair it with Chrome during that browser’s worst RAM-devouring days, and you have a problem.

Then, while desktops were getting more and more powerful just (I assume) to keep up with Flash, Apple went and introduced the iPhone. Flash. Sucked. On. Mobile. Even the vendors that went out of their way to include a Flash implementation on their smartphones almost never did it well.

It was so much of a hassle that when Apple officially dropped Flash support, the entire world said, “Okay, yeah, that’s fair.”

Side note: Flash always sucked on Linux. I’m just saying.

Ashes to Ashes…

Flash was, for its time, a good thing for the Internet as a whole. We’ve outgrown it now, but it would be reckless of us to ignore the good things it brought to the world. Like the creativity of a million amateur animators, and especially that one cartoon called “End of Ze World”.

Goodbye Flash, you sucked. And you were great. Rest in peace. Rest in pieces. Good riddance. I’ll miss you.

 

 

Featured image via Fabio Ballasina and Daniel Korpai.

Source: Webdesignerdepot

A comprehensive and thoughtful SEO strategy is what you would turn to if your goal is to improve your website’s visibility and grow traffic and revenue respectively.

While off-page tactics like link building still remain at the top of the agenda, on-page SEO is no less important in the age of semantic search.

Search engines’ attention has gradually shifted from authority alone toward the quality of the content you provide, its structure, its relevance, and the overall user experience, so taking care of those aspects also plays a major role in succeeding online.

In the past, SEO tags proved to have a significant impact on rankings, but now tags are one of the most controversial aspects of on-page SEO, surrounded by debate.

Which tags are obsolete now? Which ones are as crucial as ever?

To answer these questions, it’s important to understand the role of each type of tag and evaluate the impact it may have in terms of user- and search-friendliness.


Whether these are meta tags like title and description, or other tags classifying or organizing the content – the way we use tags and their relative impact on rankings has naturally changed over the years.

As the search engines got smarter at reading and interpreting data, using all kinds of tags in a manipulative manner has become obsolete. However, new tags and new ways of organizing data entered the game, and by changing the approach a bit, one can make great use of both old and new ones.

Let’s dive into the variety of tags and investigate their SEO importance.

Title Tags

A title tag is an HTML element in the <head> section that specifies the title of a webpage. It typically appears as a clickable headline in the SERPs and also shows up on social networks and in browsers.

Title tags are meant to provide a clear and comprehensive idea of what the page’s content is about. But do they have a major impact on rankings as they used to for many years?

On the one hand, they are no longer “a cure for all ills,” as explicit keyword stuffing just doesn’t seem to convince Google anymore. On the other hand, well-written optimized titles and higher rankings still do go hand in hand, even though the direct correlation got weaker.

Over the past few years, user behavior factors were being discussed a lot as logical proof of relevance and thus a ranking signal – even Google representatives admit its impact here and there.

The page’s title still is the first thing for a searcher to see in SERPs and decide if the page is likely to answer the search intent. A well-written one may increase the number of clicks and traffic, which have at least some impact on rankings.

A simple experiment can also show that Google no longer needs your title tag to include an exact match keyword to know the topic the page covers.

For instance, if you search for [how to build brand awareness] on Google, you’ll only see one result (Position 7) in the top 10 with the exact match phrase in the title:

[Screenshot: Google SERP for “how to build brand awareness”]

This shows how search engines are getting more powerful in reading and understanding the content and the context rather than relying on keyword instances alone.

You can see how the title isn’t the cure-all, but is a crucial piece of the puzzle that proves your page is relevant and rank-worthy.

Search engines are now taking a more comprehensive picture into account, and tend to evaluate page’s content as a whole, but the cover of a book still matters – especially when it comes to interaction with searchers.

Following best SEO practices, you should:

  • Give each page a unique title that describes the page’s content concisely and accurately.
  • Keep the titles up to 50-60 characters long (for them not to get truncated in the SERPs).
Put important keywords first, but keep the phrasing natural; write titles for your visitors first and foremost.
  • Make use of your brand name in titles.

Meta Description Tags

The meta description is another snippet of text, placed in the <head> of a webpage and commonly displayed in a SERP snippet along with the title and page URL. The purpose of a meta description is to reflect the essence of a page, but with more detail and context.

It’s no secret that meta description hasn’t been an official ranking factor for almost a decade now. However, the importance of meta description tags lies close together with title tag, as it impacts the interaction of a searcher with your site.

  • The description occupies the largest part of a SERP snippet and is a great opportunity to invite searchers to click on your site by promising a clear and comprehensive solution to their query.
  • The description impacts the amount of clicks you get, and may also improve CTR and decrease bounce rates, if the pages’ content indeed fulfills the promises. That’s why the description must be as realistic as it is inviting and distinctly reflect the content.

Surely, no description can perfectly match absolutely all queries you may rank for.

Your meta description can be any length you want, but Google typically only shows around 160 characters in the SERPs, and the snippet Google uses for your page may not be the meta description you’ve written, depending on the query.

Following best SEO practices, you should:

  • Give each page a unique meta description that clearly reflects what value the page carries.
  • Keep descriptions around 150-160 characters (including spaces), as Google’s snippets typically max out there.
  • Include your most significant keywords, but don’t overuse them. Write for people.
  • Optionally, use an eye-catching call to action, a unique proposition you offer, or additional hints on what to expect (‘Learn’, ‘Buy’ constructions, etc.), as in the sketch below.
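Putting the two tags together, the <head> of a hypothetical product page might contain:

<head>
  <title>Blue Widgets: Handmade, Free Shipping | ExampleCo</title>
  <meta name="description"
        content="Shop ExampleCo's handmade blue widgets. Free shipping on orders over $25. Learn what makes our widgets last twice as long.">
</head>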

Heading Tags (H1-H6)

Heading tags are HTML tags used to identify headings and subheadings within your content from other types of text (e.g., paragraph text).

The hierarchy goes from H1-H6, historically in a sense of “importance.” H1 is the main heading of a page (visible to users unlike meta title), and the most prominent tag showing what the page is about. H2-H6 are optional tags to organize the content in a way that’s easy to navigate.

The usage of heading tags these days is a source of some debate. While H2-H6 tags are considered not as important to search engines, proper usage of H1 tag has been emphasized in many industry studies. Apart from that, clumsy usage of H1s may keep a site from major rankings and traffic improvements.

Utilizing the heading tags certainly adds up to the architecture of the content.

  • For search engines, it’s easier to read and understand the well-organized content than to crawl through structural issues.
  • For users, headings are like anchors in a wall of text, navigating them through the page and making it easier to digest.

Both these factors raise the importance of careful optimization, where small details add up to the big SEO- and user-friendly picture and can lead to ranking increases.

Following best SEO practices, you should:

  • Give each page a unique H1 reflecting the topic the page covers, using your primary keywords in it.
  • Use H2-H6 tags where appropriate (normally, there’s no need to go further than H3), using secondary keywords relevant to each paragraph.
  • Don’t overuse the tags and the keywords in them. Keep it readable for users; a sample outline follows this list.
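A well-organized page outline might look like this (the topic and indentation are purely illustrative):

<h1>A Practical Guide to On-Page SEO</h1>
  <h2>Title Tags</h2>
  <h2>Meta Descriptions</h2>
    <h3>How Long Should a Description Be?</h3>
  <h2>Heading Tags</h2>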

Italic/Bold Tags

Italic and bold tags can be used to highlight most important parts of the content and to add a semantic emphasis on certain words.

In terms of SEO, it is commonly said that bots may appreciate such little tweaks, but won’t really care too much.

That said, these are not crucial tags to utilize, yet they may improve readability and user experience, and that never hurts: bots tend to appreciate what searchers appreciate.

Following best SEO practices, you should:

  • Only use these tags where it really makes sense. Steer clear of excessive use.
  • Scan a piece of content as a whole, to make sure it isn’t overloaded with accents and is comfortable to read and digest.

Meta Keywords Tags

At the beginning of the optimization race, meta keywords used to be small snippets of text only visible in the code, that were supposed to tell the search engines what topics the page relates to.

Naturally, over the years the tag turned into a breeding ground for spamming and stuffing, instead of honestly optimizing the content.

Now, it’s a well-known fact that Google ignores meta keywords completely – they neither impact the rankings, nor would cause a penalty if you stuff it up.

Bottom line: meta keywords are pretty much obsolete and not worth wasting too much of your time on.

Following best SEO practices, you should:

  • Simply leave the meta keywords tag out; it carries no ranking weight, and your time is better spent on the tags that do matter.

Image Alt Tags

The image alt tag is an HTML attribute added to an image tag to describe its contents. Alt tags are important in terms of on-page optimization for two reasons:

  • Alt text is displayed to visitors if any particular image cannot be loaded (or if the images are disabled).
  • Alt tags provide context, because search engines can’t “see” images.

For ecommerce sites, images often have crucial impact on how a visitor interacts with a page.

Google also says it outright: helping search engines understand what the images are about and how they go with the rest of the content may help them serve a page for suitable search queries.

Additionally, a clear and relevant description digestible for search engines raises your chances to appear among Google Images results.

Following best SEO practices, you should:

  • Do your best to optimize the most prominent images (product images, infographics, or training images), i.e., images that are likely to be looked up in Google Images search.
  • Add alt text on pages where there’s not too much content apart from the images.
  • Keep the alt text brief and clear, use your keywords reasonably, and make sure they fit naturally into the overall canvas of the page’s content, as in the example below.
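For example, a product image’s markup might read (file name and wording hypothetical):

<img src="trail-running-shoe-red.jpg"
     alt="Red women's trail running shoe, side view">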

Nofollow Link Tags

External/outbound links are the links on your site pointing to other sites. Naturally, these are used to refer to proven sources, point people towards other useful resources, or mention a relevant site for some other reason.

These links matter a lot for SEO: they can make your content look like a hand-crafted comprehensive piece backed up by reliable sources, or like a link dump with not so much valuable content.

Google is well known for its severe antipathy to any manipulative linking tactics; using them can earn you a penalty, and Google doesn’t get any less smart at detecting them.

Apart from that, in the age of semantic search, Google may treat the sources you refer to as the context, to better understand the content on your page. For both these reasons, it’s definitely worth paying attention to where you link, and how.

By default, all hyperlinks are dofollow, and when you place a dofollow link on your site, you basically ‘cast a vote of confidence’ to the linked page.

When you add a nofollow attribute to a link, it instructs search engine bots not to follow the link (and not to pass any link equity). To keep your SEO neat, preserve a healthy balance between follow and nofollow links on your pages, but normally set the following kinds of links to nofollow (see the markup sketch after this list):

  • Links to any resources that in any way can be considered as “untrusted content.”
  • Any paid or sponsored links (you wouldn’t want Google to catch you selling your “vote”).
  • Links from comments or other kinds of user-generated content which can be spammed beyond your control.
  • Internal “Sign in” and “Register” links; following them is just a waste of crawl budget.
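In markup, the attribute is added to the individual link; for example (URL hypothetical):

<!-- A paid placement: tell crawlers not to follow it or pass link equity -->
<a href="https://sponsor.example.com" rel="nofollow">Our sponsor</a>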

Robots Tags

A page-level noindex tag is an HTML element that instructs search engines not to index a given page. A nofollow tag instructs them not to follow any links on that page.
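Both directives live in a meta tag in the page’s <head>; a sketch:

<!-- Keep this page out of the index and don't follow its links -->
<meta name="robots" content="noindex, nofollow">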

While these tags don’t correlate with rankings directly, in some cases they may have some impact on how your site looks in the eyes of search engines overall.

For instance, Google highly dislikes thin content. You may not generate it intentionally, but happen to have some pages with little value for users, but necessary to have on the site for some reason.

You may also have “draft” or placeholder pages that you need to publish while they are not yet finished or optimized to their best. You probably wouldn’t want such pages to be taken into account while evaluating the overall quality of your site.

In some other cases, you may want certain pages to stay out of SERPs as they feature some kind of special deal that is supposed to be accessible by a direct link only (e.g., from a newsletter).

Finally, if you have a sitewide search option, Google recommends closing custom search results pages, which can be crawled indefinitely and waste the bot’s resources on non-unique content.

In the above cases, noindex and nofollow tags are of great help, as they give you certain control over your site as it’s seen by the search engines.

Following best SEO practices, you should:

  • Close unnecessary/unfinished pages with thin content that have little value and no intent to appear in the SERPs.
  • Close pages that unreasonably waste crawl budget.
  • Carefully check that you don’t mistakenly restrict important pages from indexing.

Canonical Tags

The canonical tag (rel="canonical") is a way of telling search engines which version of a page you consider the main one, the one you would like to be indexed by search engines and found by people.

It’s commonly used in cases when the same page is available under multiple different URLs, or multiple different pages have very similar content covering the same subject.
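The tag goes in the <head> of each duplicate (and, ideally, the canonical page itself), pointing at the preferred URL (hypothetical here):

<link rel="canonical" href="https://www.example.com/widgets/blue-widget">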

Internal duplicate content is not treated as strictly as copied content, as there’s usually no manipulative intent behind it. Yet this may become a source of confusion to search engines: unless you indicate which URL is the one you prefer to rank with, search engines may choose it for you.

The selected URL gets crawled more frequently, while the others are left behind. There is almost no penalty risk here, but this state of affairs is far from optimal.

Another benefit is that canonicalizing a page makes it easier to track performance stats associated with the content.

John Mueller also mentions that using a rel=canonical for duplicate content helps Google consolidate all your efforts and pass the link signals from all the page’s versions to the preferred one. That is where using the canonical tag may help you steer the SEO effort in one direction.

Following best SEO practices, you should canonicalize:

  • Pages with similar content on the same subject.
  • Duplicate pages available under multiple URLs.
  • Versions of the same page with session IDs or other URL Parameters that do not affect the content.

Schema Markup

Schema markup is a shared markup vocabulary recognized by search engines, letting you organize data in a logical way. It has been on everyone’s lips lately as one of the most underrated tweaks.

A “semantic web” is a “meaningful web,” where the focus shifts from keywords instances and backlinks alone to concepts behind them and relationships between those concepts. Structured data markup is exactly what helps search engines to not only read the content but also understand what certain words relate to.

The SERPs have evolved so much that you may not even need to click through the results to get an answer to your query. But if one is about to click, a rich snippet with a nice pic, a 5-star rating, specified price-range, stock status, operating hours or whatever is useful – is very likely to catch an eye and attract more clicks than a plain-text result.

Assigning schema tags to certain page elements makes your SERP snippet rich on information that is helpful and appealing for users. And, back to square one, user behavior factors like CTR and bounce rate add up to how search engines decide to rank your site.
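As a sketch, a product page might embed structured data as JSON-LD like this (all values are hypothetical; microdata and RDFa are accepted as well):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Blue Widget",
  "image": "https://www.example.com/images/blue-widget.jpg",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.7",
    "reviewCount": "128"
  }
}
</script>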

Following best SEO practices, you would:

  • Study available schemas on schema.org.
  • Create a map of your most important pages and decide on the concepts relevant to each.
  • Implement the markup carefully (using Structured Data Markup Helper if needed).
  • Thoroughly test the markup to make sure it isn’t misleading or added improperly.

Social Media Meta Tags

Open Graph was initially introduced by Facebook to let you control how a page looks when shared on social media. It is now recognized by Google+ and LinkedIn as well. Twitter Cards offer similar enhancements, but are exclusive to Twitter.

By using these social media meta tags, you can provide a bit more information about your page to social networks. By enhancing the appearance, you make the shared page look more professional and inviting, and increase the likelihood of clicking on it and sharing it further. This is not a crucial tweak, but it’s an absolutely nothing-to-lose one, with a couple of potential benefits.

To ensure your pages look good when shared across social media platforms, define at minimum a title, description, and preview image for each shareable page, as in the sketch below.
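A minimal set of tags might look like this (all values are placeholders):

<!-- Open Graph tags, read by Facebook, LinkedIn, and others -->
<meta property="og:title" content="12 Excellent Free Tools from Google">
<meta property="og:description" content="A look at twelve Google tools that genuinely help designers and developers.">
<meta property="og:image" content="https://www.example.com/images/share-card.jpg">
<meta property="og:url" content="https://www.example.com/google-tools">

<!-- Twitter-specific card type -->
<meta name="twitter:card" content="summary_large_image">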

Viewport Meta Tag

Viewport meta tag allows you to configure how a page would be scaled and displayed on any device. Commonly, the tag and the value would look as follows:

<meta name="viewport" content="width=device-width, initial-scale=1">

Here "width=device-width" makes the page match the screen’s width in device-independent pixels, and "initial-scale=1" establishes a 1:1 relationship between CSS pixels and device-independent pixels, taking screen orientation into account.

This tag is a no-brainer to add; the before-and-after screenshots in Google’s documentation are enough to show the difference it makes.

The viewport meta tag has nothing to do with rankings directly, but it has a ton to do with the user experience, especially considering the variety of devices in use nowadays and the noticeable shift toward mobile browsing.

As with many of the tags and tweaks above, taking care of it will be appreciated by users (or, more likely, not taking care of it will be depreciated), and your CTR and bounce rates will reflect the small efforts you make accordingly.

Conclusion

To get the most out of your on-page strategy, don’t neglect the small tweaks that add up to the big picture.

As of now, some tags are still must-haves, as they make up the taxonomy of your page; others are not vital, but can put you one rich snippet ahead of competitors who just didn’t bother.

Small changes that improve user experience and help search engines understand your site better will be appreciated by both sides, and will definitely pay off in the long run.

Source: Search Engine Journal (https://ift.tt/2sNS4GJ)

Qwant and Inria have announced a four-year partnership, with the goal of setting up a joint laboratory between the two organizations to work on issues related to privacy and search engines.
Source: ZDNet