
kksctf: Lynx


Solution

We are given a URL. However, when we make a request, it seems there is some kind of protection:

[jusepe@nix:~]$ curl http://tasks.kksctf.ru:30070/ && echo ""
You are not lynx, access denied

The message is a clear indicator that we need to use lynx in order to see the content; the command would be lynx http://tasks.kksctf.ru:30070/.

Homepage

There is a reference to robots.txt on the home page, so let's check it by pressing G inside lynx to edit the URL:

These steps could also be accomplished without lynx, since only the root directory is protected.
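The filtering logic just described can be sketched as a small predicate: only the root path is gated on the User-Agent, so paths like /robots.txt are reachable with any client. This is a hypothetical reconstruction, not the challenge's actual server code; the function name and the Lynx/ prefix check are assumptions.

```python
# Hypothetical sketch of the challenge's filter: only "/" is protected,
# and the check is a simple prefix match on the User-Agent header.
# (Assumption: the real server may implement this differently.)

def is_blocked(path: str, user_agent: str) -> bool:
    """Return True if the request should get 'access denied'."""
    if path != "/":  # only the root directory is protected
        return False
    return not user_agent.startswith("Lynx/")

# curl on "/" is denied, but /robots.txt is reachable without lynx:
print(is_blocked("/", "curl/8.5.0"))            # True  (denied)
print(is_blocked("/robots.txt", "curl/8.5.0"))  # False (allowed)
print(is_blocked("/", "Lynx/2.9.0dev.6"))       # False (allowed)
```

This also explains why rewriting the URL inside lynx and fetching robots.txt with plain curl both work.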

Robots.txt

It seems that there is a hidden directory that finally contains the flag:

Robots.txt

Alternative path

In order to whitelist specific web browsers, the website may be filtering based on the User-Agent header, so we used Burp Suite to check which User-Agent lynx sends, with the following commands:

[jusepe@nix:~]$ export http_proxy=http://127.0.0.1:8080
[jusepe@nix:~]$ lynx http://tasks.kksctf.ru:30070/

Here is the intercepted request:

Intercepted request

Now we can use curl with the same User-Agent and confirm that we bypass the protection:

[jusepe@nix:~]$ curl http://tasks.kksctf.ru:30070/ -A "Lynx/2.9.0dev.6"

        <!DOCTYPE html>
<html>
    <head>
        <title>Code panel</title>
        <script type="text/javascript" src="code.js"></script>
    </head>
    <body>
        <center> WELCOME </strong>
        </br>
        </br>
        </br>
        <p>Let's defend our friend - Lynx - from robots!</p>
        </br>
        </br>
        </br>
        </br>
        </br>
        </br>
        </br>
        <footer> 
        <center>(C) BluePeace, 2053</center>
        </footer>
    </body>
</html>
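The whole bypass can be reproduced locally with a minimal sketch: a throwaway HTTP server that denies any client whose User-Agent does not start with Lynx/, and a client that spoofs the header the way curl -A does. Everything here (class names, the prefix check, the response bodies) is an assumption for illustration, not the challenge's real code.

```python
# Local reproduction of the User-Agent gate, showing why
# `curl -A "Lynx/2.9.0dev.6"` bypasses the protection.
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class Gate(BaseHTTPRequestHandler):
    def do_GET(self):
        ua = self.headers.get("User-Agent", "")
        # Assumption: the check is a simple prefix match on the header.
        body = b"WELCOME" if ua.startswith("Lynx/") else b"You are not lynx, access denied"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), Gate)  # port 0 = pick a free port
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

def fetch(agent: str) -> str:
    """GET / with an explicit User-Agent, like curl -A."""
    req = urllib.request.Request(
        f"http://127.0.0.1:{port}/", headers={"User-Agent": agent}
    )
    return urllib.request.urlopen(req).read().decode()

denied = fetch("curl/8.5.0")
allowed = fetch("Lynx/2.9.0dev.6")
print(denied)   # You are not lynx, access denied
print(allowed)  # WELCOME
server.shutdown()
```

The same header spoofing works with any HTTP client; lynx itself was never required, only its User-Agent string.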

WRITTEN BY
ITasahobby
InTernet lover