Merge lp:~widelands-dev/widelands-website/static_robots_txt into lp:widelands-website

Proposed by kaputtnik
Status: Merged
Merged at revision: 445
Proposed branch: lp:~widelands-dev/widelands-website/static_robots_txt
Merge into: lp:widelands-website
Diff against target: 35 lines (+13/-0)
2 files modified
templates/robots.txt (+10/-0)
urls.py (+3/-0)
To merge this branch: bzr merge lp:~widelands-dev/widelands-website/static_robots_txt
Reviewer                 Review Type    Date Requested    Status
SirVer                                                    Approve
kaputtnik                (community)                      Needs Resubmitting
Review via email: mp+313398@code.launchpad.net

Description of the change

Add a robots.txt.

The content was discussed with janus and should be OK.

449. By kaputtnik

allow polls and screenshots

Revision history for this message
SirVer (sirver) wrote :

I feel really uncomfortable about the blacklisting; I think it should be a whitelist.

Otherwise, some nits.

review: Needs Information
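
For reference, a whitelist-style robots.txt might have looked roughly like the sketch below. This is hypothetical and not what was merged; the allowed paths are illustrative (taken from the "allow polls and screenshots" commit), and the "Allow" directive is an extension honored by major crawlers such as Googlebot rather than part of the original robots.txt standard.

# Hypothetical whitelist variant: permit selected sections, block everything else
User-agent: *
Allow: /polls
Allow: /screenshots
Disallow: /

Sitemap: https://wl.widelands.org/sitemap.xml/

The Allow lines are listed before the catch-all Disallow because, under the original first-match rule, earlier records take precedence; Googlebot instead applies the most specific (longest) matching rule, so it accepts either order.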
450. By kaputtnik

reverted to r446

451. By kaputtnik

added whitespace; cleanup robots.txt

Revision history for this message
kaputtnik (franku) wrote :

So the first version was better than the adjusted version.

Thanks for looking into it :-)

review: Needs Resubmitting
Revision history for this message
SirVer (sirver) wrote :

> So the first version was better than the adjusted version.

Well - we did not know that at the time :). Software development is iterative by nature; sometimes you have to take the roundabout way to get back to where you started - but only with the knowledge acquired along the way can you truly judge which solution, new or old, is best.

lgtm.

review: Approve
Revision history for this message
kaputtnik (franku) wrote :

Thanks :-)

Merged and deployed.

Preview Diff

=== added file 'templates/robots.txt'
--- templates/robots.txt	1970-01-01 00:00:00 +0000
+++ templates/robots.txt	2016-12-16 08:27:03 +0000
@@ -0,0 +1,10 @@
+# robots.txt for wl.widelands.org
+
+# These things should never be crawled
+User-agent: *
+Disallow: /profile
+Disallow: /admin
+Disallow: /accounts
+
+# url to sitemap
+Sitemap: https://wl.widelands.org/sitemap.xml/

=== modified file 'urls.py'
--- urls.py	2016-12-13 18:28:51 +0000
+++ urls.py	2016-12-16 08:27:03 +0000
@@ -7,6 +7,7 @@
 from mainpage.views import mainpage
 from news.feeds import NewsPostsFeed
 from django.views.generic.base import RedirectView
+from django.views.generic import TemplateView
 from django.contrib.syndication.views import Feed
 from registration.backends.hmac.views import RegistrationView
 from mainpage.forms import RegistrationWithCaptchaForm
@@ -15,6 +16,8 @@
 urlpatterns = [
     # Creating a sitemap.xml
     url(r'^sitemap\.xml/', include('sitemap_urls')),
+    # Static view of robots.txt
+    url(r'^robots\.txt/', TemplateView.as_view(template_name='robots.txt', content_type="text/plain")),

     # Uncomment the next line to enable the admin:
     url(r'^admin/', admin.site.urls),
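
As a sanity check, here is a minimal sketch of a Django test for the new view. The test is hypothetical and not part of the merged branch; it assumes Django's built-in test client and the URL pattern shown in the diff above.

# test_robots.py - hypothetical smoke test for the robots.txt view
from django.test import TestCase

class RobotsTxtTest(TestCase):
    def test_robots_txt_is_served_as_plain_text(self):
        # The URL pattern uses a trailing slash (r'^robots\.txt/'),
        # so request the slashed form directly.
        response = self.client.get('/robots.txt/')
        self.assertEqual(response.status_code, 200)
        # TemplateView was configured with content_type="text/plain".
        self.assertTrue(response['Content-Type'].startswith('text/plain'))
        # The template disallows all agents from the listed paths.
        self.assertContains(response, 'User-agent: *')

Note that crawlers request /robots.txt without a trailing slash; assuming Django's default CommonMiddleware settings (APPEND_SLASH=True), such a request is redirected to /robots.txt/, which major crawlers follow.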
