
Wednesday, May 28, 2014

SEO Offpage

Parameters/Tools

  • Alexa Rank
  • Popularity
  • PageRank

Tools and Link Attributes
  • Updates


Strategy/Tools
  • Links
  • Link Baiting
  • Trends and tactics
SEO off-page campaign
Start: 29.05.2014 08:00 AM
End: 30.05.2014
Alexanderstrasse 3
Berlin
10179
Germany

SEO Onpage Optimization

Meta information

Resources: MOZ, 40defiebre

<head>

for the search engine

<meta>

<meta name="title" content="keyword"/>

Title Tag
The title element of a web page is meant to be an accurate and concise description of a page's content. This element is critical to both user experience and search engine optimization. It creates value in three specific areas: relevancy, browsing, and the search engine results pages. (By contrast, the <meta name="title"> tag shown above is not relevant for Google.)

Code Sample

<head>
<title>Example Title</title>
</head> 


<meta name="keywords" content="keyword1, keyword2"/>
The meta keywords tag is not important for Google.

<meta name="description" content="keyword" />
Meta descriptions are HTML attributes that provide concise explanations of the contents of web pages. Meta descriptions are commonly used on search engine result pages (SERPs) to display preview snippets for a given page.

Code Sample
<head>
<meta name="description" content="This is an example of a meta description. This will often show up in search results.">
</head> 

Optimal Length for Search Engines

The description should optimally be between 150 and 160 characters (roughly 155).
Avoid Duplicate Meta Description Tags
As with title tags, it is important that meta descriptions on each page be unique. One way to combat duplicate meta descriptions is to create a dynamic and programmatic way to make unique meta descriptions for automated pages.
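One hedged sketch of such a programmatic approach, in Python (the helper name, input fields, and word-boundary truncation are illustrative assumptions, not part of any standard):

```python
# Sketch: build a unique meta description per page and trim it to roughly
# 155 characters, the length suggested above. The inputs are hypothetical
# page-specific fields.

def make_meta_description(name, summary, max_len=155):
    """Combine page-specific fields and truncate at a word boundary."""
    text = f"{name}: {summary}"
    if len(text) <= max_len:
        return text
    # Cut at the last space before the limit so no word is split,
    # leaving room for the trailing ellipsis.
    cut = text.rfind(" ", 0, max_len - 3)
    return (text[:cut] + "...") if cut > 0 else text[:max_len]

desc = make_meta_description(
    "Blue Widget", "A durable widget for everyday use, shipped worldwide.")
tag = f'<meta name="description" content="{desc}">'
print(tag)
```

Because the description is assembled from fields unique to each page, no two automated pages end up sharing the same tag.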

Not a Google Rank Factor

Google announced in September 2009 that neither meta descriptions nor meta keywords factor into Google's ranking algorithms for web search. Google uses meta descriptions to return results when searchers use advanced search operators to match meta tag content, as well as to pull preview snippets on search result pages, but it's important to note that meta descriptions do not influence Google's ranking algorithms for normal web search.

<meta name="author" content="author name">
<a href='http://plus.google.com/ProfilID' rel='author'>about me</a>
<a href='http://plus.google.com/seinenlID' rel='author'>Estefaniaojea</a>

Google crawlers

"Crawler" is a generic term for any program (such as a robot or spider) used to automatically discover and scan websites by following links from one webpage to another. Google's main crawler is called Googlebot.
https://developers.google.com/webmasters/control-crawl-index/docs/getting_started?csw=1
Robots meta tag
Some pages use multiple robots meta tags to specify directives for different crawlers, like this:
<meta name="robots" content="nofollow">
<meta name="googlebot" content="noindex">
In this case, Google will use the sum of the negative directives, and Googlebot will obey both the noindex and nofollow directives. Google's documentation has more detailed information about controlling how Google crawls and indexes your site.
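The "sum of the negative directives" behaviour can be sketched as a small function (a simplified model for illustration, not Google's actual implementation; it treats every listed directive uniformly):

```python
# Sketch: a crawler obeys the union of directives addressed to it, both
# from the generic "robots" meta tag and from its own named tag.

def effective_directives(meta_tags, crawler="googlebot"):
    """meta_tags: list of (name, content) pairs from <meta> elements."""
    directives = set()
    for name, content in meta_tags:
        if name.lower() in ("robots", crawler.lower()):
            directives |= {d.strip().lower() for d in content.split(",")}
    return directives

tags = [("robots", "nofollow"), ("googlebot", "noindex")]
print(sorted(effective_directives(tags)))  # ['nofollow', 'noindex']
```

A different crawler reading the same page would pick up only the generic "robots" tag, so it would see nofollow but not noindex.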

Sitemaps.xml


Sitemaps are a way to tell Google about pages on your site that it might not otherwise discover. In its simplest terms, an XML sitemap is a list of the pages on your website. Creating and submitting a sitemap helps make sure that Google knows about all the pages on your site, including URLs that may not be discoverable by Google's normal crawling process.
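A minimal sitemap.xml in the sitemaps.org format looks like this (the URL, date, and optional fields are placeholders; only <loc> is required, and each page gets its own <url> entry):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2014-05-28</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```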


HTTP status codes


302 Found: a temporary redirect, meaning the requested resource temporarily resides at a different URL. For permanent moves, a 301 redirect is generally preferred for SEO, so that ranking signals are passed to the new URL.
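Python's standard library already models these status codes, which is handy when logging crawl results; a quick sketch:

```python
from http import HTTPStatus

# The stdlib knows each registered code's standard phrase, and the
# leading digit gives the broad class (3xx = redirection, 4xx = client
# error, and so on).
status = HTTPStatus(302)
print(status.phrase)        # Found
print(status.value // 100)  # 3, i.e. the redirection class
```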

Robots.txt 



The robots exclusion protocol (REP), or robots.txt, is a text file webmasters create to instruct robots (typically search engine robots) how to crawl and index pages on their website.

Allow all web crawlers access to all content
User-agent: *
Disallow:

Block all web crawlers from all content
User-agent: *
Disallow: /

Block a specific web crawler from a specific folder
User-agent: Googlebot
Disallow: /no-google/

Block a specific web crawler from a specific web page
User-agent: Googlebot
Disallow: /no-google/blocked-page.html

Allow a specific web crawler to visit a specific web page
User-agent: *
Disallow: /no-bots/block-all-bots-except-rogerbot-page.html
User-agent: rogerbot
Allow: /no-bots/block-all-bots-except-rogerbot-page.html

Sitemap Parameter
User-agent: *
Disallow:
Sitemap: http://www.example.com/none-standard-location/sitemap.xml

Important Rules


  • Only one "Disallow:" line is allowed for each URL.
  • Each subdomain on a root domain uses separate robots.txt files.
  • Spacing is not an accepted way to separate query parameters. For example, "/category/ /product page" would not be honored by robots.txt.
  • In order to exclude individual pages from search engine indices, the noindex meta tag <meta name="robots" content="noindex"> is actually superior to robots.txt.
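The rules above can be sanity-checked before deploying a file; Python's standard library ships a robots.txt parser, shown here against the Googlebot example from this section:

```python
from urllib import robotparser

# Sketch: verify that a robots.txt file behaves as intended, using the
# stdlib parser instead of guessing. The file mirrors the Googlebot
# example above.
robots_txt = """\
User-agent: Googlebot
Disallow: /no-google/
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("Googlebot", "/no-google/blocked-page.html"))  # False
print(rp.can_fetch("Googlebot", "/public/page.html"))             # True
```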


Most Important Rank Factors 

1. Page Level Factors

  • Keyword in Title Tag
  • Title that starts with keyword
  • Duplicate content is bad
  • Rel=canonical
  • Keyword in H2, H3 Tags
  • Proper Grammar and Spelling
  • Too many broken links
  • URL length


2. Site Level Factors


  • Content provides value
  • Domain Trust
  • Presence of Sitemaps
  • Presence of YouTube videos
  • Site usability


3. Backlinks factors


  • Link location on page
  • Backlink age
  • Schema.org microformats (pages with microformats rank better)


4. User interaction
5. Special algorithm rules
6. Social signals
7. Brand signals
8. On Site Webspam factors
9. Off Page Web Spam factors


What is a Permalink?

A permalink is the permanent URL of an individual page or blog post, intended to remain unchanged so that inbound links keep working.


Monday, May 26, 2014

Display Advertising

Topic 1


Display Marketing Basics


AIDA is an acronym used in marketing and advertising that describes a common list of events that may occur when a consumer engages with an advertisement.
  • A – attention (awareness): attract the attention of the customer.
  • I – interest: raise customer interest by focusing on and demonstrating advantages and benefits (instead of focusing on features, as in traditional advertising).
  • D – desire: convince customers that they want and desire the product or service and that it will satisfy their needs.
  • A – action: lead customers towards taking action and/or purchasing.





Predictive Behavioural Targeting
Behavioral Targeting refers to a range of technologies and techniques used by online website publishers and advertisers aimed at increasing the effectiveness of advertising using user web-browsing behavior information. In particular, "behavioral targeting uses information collected from an individual’s web-browsing behavior (e.g., the pages that they have visited or the searches they have conducted) to select advertisements to display".[1]

Guide to Online marketing video
When a consumer visits a web site, the pages they visit, the amount of time they view each page, the links they click on, the searches they make, and the things they interact with allow sites to collect that data, along with other factors, to create a 'profile' linked to that visitor's web browser. As a result, site publishers can use this data to create defined audience segments based upon visitors that have similar profiles. When visitors return to a specific site or a network of sites using the same web browser, those profiles can be used to allow advertisers to position their online ads in front of those visitors who exhibit a greater level of interest and intent for the products and services being offered. On the theory that properly targeted ads will fetch more consumer interest, the publisher (or seller) can charge a premium for these ads over random advertising or ads based on the context of a site.
Behavioral marketing can be used on its own or in conjunction with other forms of targeting based on factors like geography, demographics or contextual web page content. It's worth noting that many practitioners also refer to this process as "Audience Targeting".

Onsite Behavioral Targeting

Most platforms identify visitors by assigning a unique ID cookie to each and every visitor to the site, thereby allowing them to be tracked throughout their web journey; the platform then makes a rules-based decision about what content to serve.

Self-learning onsite behavioral targeting systems will monitor visitor response to site content and learn what is most likely to generate a desired conversion event. The best content for each behavioral trait or pattern is often established using numerous simultaneous multivariate tests. Onsite behavioral targeting requires a relatively high level of traffic before statistical confidence levels can be reached regarding the probability of a particular offer generating a conversion from a user with a set behavioral profile. Some providers, such as Yahoo!, have been able to do so by leveraging their large user base. Other providers use a rules-based approach, allowing administrators to set the content and offers shown to those with particular traits.
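A rules-based approach like the one just described can be sketched in a few lines; every trait, rule, and content name below is hypothetical:

```python
# Sketch: a toy rules-based onsite targeting engine. Each rule maps a
# visitor-profile predicate to the content variant to serve; when no
# rule matches, the default content is shown.

RULES = [
    (lambda v: v.get("pages_viewed", 0) > 5,      "engaged-visitor-offer"),
    (lambda v: "pricing" in v.get("visited", []), "discount-banner"),
]

def choose_content(profile, default="default-homepage"):
    for matches, content in RULES:
        if matches(profile):
            return content
    return default

print(choose_content({"pages_viewed": 8}))       # engaged-visitor-offer
print(choose_content({"visited": ["pricing"]}))  # discount-banner
print(choose_content({}))                        # default-homepage
```

A self-learning system would differ mainly in that the rule-to-content mapping is adjusted automatically from observed conversion rates rather than set by an administrator.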
Contextual Targeting
The process that matches ads to relevant sites in the Display Network using your keywords or topics, among other factors.
  • Here's how it works: Google's system analyzes the content of each webpage to determine its central theme, which is then matched to your ad using your keywords and topic selections, your language and location targeting, a visitor's recent browsing history, and other factors.
  • AdWords uses contextual targeting when an ad group has keywords or topics and its campaign is set to show ads on the Display Network.