From Github to Gitlab

Posted: 2018-07-08 15:54:20



I've had a Gitlab account for a couple of years because of their free private repos, and ever since I have pondered moving all my public Github repos across, but never had the push to do it. The main reason not to was that Github was the de facto source-control site, so when dealing with non-tech people like recruiters it was easier to just point them at Github rather than explain the (at the time) lesser-known Gitlab.

Now, with Microsoft's acquisition of Github, I have that push. The exodus of repos to Gitlab has given it a much greater reputation outside tech circles, so my code has now been moved.

The Github projects still exist but just have a README.md directing people to Gitlab.

https://gitlab.com/alasdairkeyes


If you found this useful, please feel free to donate via bitcoin to 1NT2ErDzLDBPB8CDLk6j1qUdT6FmxkMmNz

Perl Capture::Tiny

Posted: 2018-05-31 07:01:22



For such a versatile language, Perl has no good built-in way of running external commands.

You can obviously use exec(), system() or even the dreaded backticks ``. But different situations call for different things. Maybe you just need an exit code, in which case system() is fine. If you're looking to capture output, backticks are better suited to the job (exec() is no help at all, since it replaces the current process and never returns), but then escaping input and using pipes or redirects becomes a nightmare.

Last year I worked on a project that required a large amount of systems integration, calling binaries to run tasks and processing their output, so I had to find a reliable and useful method. Mixing system() and exec() was possible, but as I would be handing the code off to the customer, having a single way of running commands is a lot cleaner.

I came across https://metacpan.org/pod/Capture::Tiny. It provides a clean interface for capturing output from both Perl code and running system binaries.

#!/usr/bin/env perl

use strict;
use warnings;
use Capture::Tiny ':all';
use v5.20;

my ($stdout, $stderr, $exit_code) = capture {
    system("ls", "/var");
};

say "STDOUT\n$stdout";
say "STDERR\n$stderr";
say "EXITCODE: $exit_code";
Which gives:

STDOUT
backups
cache
games
lib
local
lock
log
mail
opt
run
spool
tmp

STDERR

EXITCODE: 0

The only flaw seems to be that it doesn't handle capturing output from long-running processes that emit text slowly; the output only arrives once the command has finished. For that you will want to use something like pump from https://metacpan.org/pod/IPC::Run



Quick wins for your dev team

Posted: 2018-05-30 16:07:47



The joy of contracting is that I get to work on brand-new code-bases with some regularity. The downside is exactly the same.

Over a number of contracts I've noticed several things that, when suggested to teams, made their development much easier.

If you work for a fair-sized business or a 'tech' company you can stop reading here; you will look at these suggestions, scoff, and wonder why they need making. However, there are a lot of companies that are not 'tech' focused. Their website functionality was bolted onto an existing business that pre-dates the web: the first site was a few scripts written by the most tech-savvy employee they had at the time, and it grew organically from there. Yes, these code-bases are often a mess, and constant tight deadlines for new features and products mean they stay that way, but that's no reason not to try. It's usually the devs, having to work with the code day-in and day-out, who get most jaded with the situation.

These suggestions are mainly based around PHP, but they can be easily adapted for any other language.

Version Control

This might seem like a no-brainer, but I still see teams without version control (or still stuck on CVS/SVN). Creating an account on Gitlab is highly recommended: not only do you get git hosting, you also get free CI/CD pipelines for automated testing, which will come in handy if you adopt the testing suggestions later.

Upside

  • Well, you get version control. Multiple devs can work on the same code with less risk of implementing breaking changes.

Downside

  • If your team is unfamiliar, a little training may be required: a morning reading up on Gitflow and a few articles on branching and merging will be more than enough to cover the basic concepts.
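Getting an existing code base under git takes only a handful of commands. A minimal sketch (the Gitlab remote URL and file are hypothetical placeholders for your own project):

```shell
# Work in a throwaway directory for the sake of the example
cd "$(mktemp -d)"
echo "<?php phpinfo();" > index.php

# Put the existing files under version control
git init -q
git add .
git -c user.email=dev@example.com -c user.name=Dev commit -q -m "Initial import"

# Point it at your Gitlab project (hypothetical URL)
git remote add origin git@gitlab.com:example/legacy-site.git
git log --oneline
```

From there, a push publishes the history and the team can branch and merge as usual.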

Consistent Clean Code Formatting and naming

We've all seen code like this, or if you're unlucky, with even fewer line breaks or line breaks in crazy places:

public function fib($n) {if ($n < 0) { return NULL; }
elseif ($n === 0) { return 0; } elseif ($n === 1 || $n === 2) { return 1; }
else { return $this->fib($n-1) + $this->fib($n-2); }}

The code's purpose is not easily understood and the name fib doesn't instantly describe what the function is doing.

PSR-2 is the PHP Framework Interoperability Group's coding style, which has become the de facto standard for PHP and will make your code much nicer to work with.

Along with nice naming you can turn the above into...

public function fibonacci($n) {
    if ($n < 0) {
        return null;
    } elseif ($n === 0) {
        return 0;
    } elseif ($n === 1 || $n === 2) {
        return 1;
    } else {
        return $this->fibonacci($n-1) + $this->fibonacci($n-2);
    }
}

You don't have to strictly adhere to a well-known standard, but using a consistent, clean formatting approach can make your life just a little bit easier.

You can do it in one fell swoop with a tool such as PHP Coding Standards Fixer, or take a gradual approach whereby any file about to receive new code is reformatted and committed before the changes are made. IDEs such as PHPStorm also have a reformatting feature available via CTRL+ALT+L.

Upside

  • You'll restore some sanity to your devs

Downside

  • Reformatting large swathes of your code in one fell swoop will make you nervous. A more cautious approach is to reformat a class/file before you start new work on it, commit the reformat, then make your changes.

Adopt a standard documentation format

Each language has its own documentation format: PHPDoc for PHP, Perldoc for Perl, etc. Which one you use isn't the key point so much as actually using one. A quick one-line introduction to a class/function, plus documentation of the expected arguments and return values, will make your code much more pleasant for people unfamiliar with it.

Again, a rolling adoption is easy and fairly hassle-free. Write the required docs for new functions and, as you edit existing ones, back-fill their documentation. In addition, if you use code reviews, have reviewers check documentation updates along with the code being committed.

Upside

  • Over a short amount of time, you can generate usable documentation for your code base.
  • If your IDE supports parsing documentation (as PHPStorm does with PHPDoc), it will speed up your development by showing you function arguments and alerting you to exceptions that need catching, without you having to search through the code.

Downside

  • There isn't really one; it only adds a few minutes to the time spent developing your code.

Basic tests

The chances are that if you have not been writing tests from the start, it will not be easy to introduce them to old code. The primary reason is that the code probably wasn't written in a manner that allows easy testing: dependencies aren't injected, and functions perform too much work for concise tests to be written against them.

Don't let this put you off. Simple tests can be put in place quite easily for even the most atrocious code base.

Firstly, if your team is unfamiliar, take a morning to read up on the different types of tests commonly used (unit, integration, functional, behavioural, etc.), the benefits they provide and when you are likely to use each. There are plenty of articles and Youtube videos available to get you started.

Secondly, look through your code base for simple functions that can have unit tests written for them. These might be few and far between to start with, but once the tests are written you get a little more peace of mind that a future commit is less likely to break functionality without anyone noticing.

If you use a common framework, most provide a customised test suite to enable rapid test development for web-app functionality. A simple, quick and useful test could be to ensure that access to login-restricted URLs returns a suitable message and/or HTTP status code along with a redirect to a login page. This can then be extended to test valid/invalid login messages and redirects. It's a small step, but you're heading in the right direction.

Testing is a huge subject and you could easily get drawn down a rabbit hole; however, with just a few tests you are already more likely to stop errors creeping into your code.

Upside

  • Basic functionality can be tested with little overhead to ensure that your code is doing what it should be.

Downside

  • Getting beyond some basic tests will likely require a lot of refactoring of your code... which in turn requires tests to ensure you don't break anything. It can get messy if not tackled carefully.

This isn't an exhaustive list. If you find yourself needing to implement some of these, you will probably discover that your development processes need many more changes, but these suggestions will give you a lot of benefit without tying up devs for days or months.



expandtab in vim 8 on Debian 9 not working

Posted: 2018-05-28 14:41:44



Last week I finished rolling out my new Debian 9 "Stretch" machines and noticed some peculiar behaviour in vim. Namely, my tabs were being kept as tabs and not converted automatically to spaces. We'll ignore the old "tabs vs. spaces" debate for the moment but I was struggling to see why my tried and tested vim config wasn't working.

My .vimrc file had the following lines

" Set tabs to 4 spaces
set shiftwidth=4
set ts=4
set expandtab

" Allow simple pasting and add ruler/linecounts
set paste

After some hunting I found the following article explaining the issue https://bugs.launchpad.net/ubuntu/+source/vim/+bug/1576583

Debian 9 moves to vim version 8.0 and it seems that the behaviour of paste has changed so that it resets expandtab. All that's required is to move the set paste line before expandtab and you're back in business. Hopefully this might help someone else who has seen the same issues.
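So the working .vimrc ordering is simply:

```vim
" Allow simple pasting -- must come before expandtab in vim 8
set paste

" Set tabs to 4 spaces
set shiftwidth=4
set ts=4
set expandtab
```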



keys.gnupg.net website pool broken

Posted: 2018-04-24 08:30:10



Until recently, the GnuPG section of my site which lists my GPG fingerprint had a link to keys.gnupg.net for visitors to make some verification of my key.

I accidentally clicked this link a few days ago and noticed that I was redirected to https://analytics.sumptuouscapital.com/ instead of the expected https://keys.gnupg.net website. I checked with Mike and he was seeing the correct site.

This looked interesting, it could be some misconfiguration or potentially something more nefarious like a DNS poisoning.

I dug into it a little and it looks like gnupg.net operates a round-robin DNS setup for its web server cluster, with 9 hosts.

$ host -t A keys.gnupg.net
keys.gnupg.net is an alias for hkps.pool.sks-keyservers.net.
hkps.pool.sks-keyservers.net has address 193.224.163.43
hkps.pool.sks-keyservers.net has address 193.164.133.100
hkps.pool.sks-keyservers.net has address 176.9.147.41
hkps.pool.sks-keyservers.net has address 192.94.109.73
hkps.pool.sks-keyservers.net has address 51.15.53.138
hkps.pool.sks-keyservers.net has address 216.66.15.2
hkps.pool.sks-keyservers.net has address 68.187.0.77
hkps.pool.sks-keyservers.net has address 92.43.111.21
hkps.pool.sks-keyservers.net has address 37.191.226.104

I wrote a small script to query each individual IP for the keys.gnupg.net website; the results were:

  1. 37.191.226.104: Redirects to https://analytics.sumptuouscapital.com/
  2. 192.94.109.73: No response
  3. 193.164.133.100: Redirects to https://keys.gnupg.net
  4. 18.9.60.141: No response
  5. 68.187.0.77: No response
  6. 216.66.15.2: No response
  7. 176.9.147.41: No response
  8. 51.15.53.138: No response
  9. 193.224.163.43: No response

On the plus side, it doesn't look to be anything nefarious, just a lack of maintenance. The GnuPG keys web server setup looks genuinely broken; I have no idea how long it has been this way, but it doesn't scream 'secure'.

As such, I've removed the keys.gnupg.net link from my site and now just use pgp.mit.edu, and I suggest you stop using keys.gnupg.net too.



Have you been pwned? Maybe not as fully as you think

Posted: 2018-03-02 09:10:09



Those with an interest in security breaches are probably aware of the site Have I Been Pwned (HIBP), run by Troy Hunt.

I've used it to check email addresses in the past; however, Troy has added some useful new features to the site over the past few years.

I gave the domain search option a go. Instead of searching for a single address, you give it your domain and it identifies all email aliases on that domain that have been found in compromised lists: https://haveibeenpwned.com/DomainSearch If you operate your own personal or company domain(s), it's well worth looking into.

It's very straightforward: you can validate domain ownership using a number of methods (DNS, email, HTTP) and download the information in various formats such as MS Excel or JSON.

When reviewing this information, one thing I noticed is that the Onliner Spambot breach listed quite a few aliases on my domains that I have never used. In particular, I've owned the akeyes.co.uk domain since 2005 and it was unregistered before then, so the data is unlikely to be from a previous domain owner. In fact, on akeyes.co.uk only 2 of the 9 listed aliases could ever have received email and therefore been used to access online services.

My first thought was that these aliases were part of a scatter-gun approach to spam; however, as the leak they came from also contained passwords or password hashes, there are some other possible inferences from this data.

  1. There's no indication as to which aliases had passwords, and apparently not all did, but the leak description notes that "many of which were also accompanied by corresponding passwords", so we can assume over 25% did. If these addresses have never been used for mail or online services, it seems unlikely that a legitimate password exists for them. Perhaps a password was obtained for leakedemail@domain.com and then tried against other common aliases on the same domain in an attempt to compromise a mail server account; that would be a far more efficient way of attacking a mailbox than just trying known passwords from other domains.

  2. Although the sale of personal/account details is quite prevalent, the cost per email/password combination is very low. If this list was built by purchasing compromised details, it could indicate that black-market sellers are padding out their lists with dummy addresses and passwords/password hashes to make them more appealing to buyers.

  3. Nefarious types may have signed up for online services using email addresses on my domain, and those services were later compromised. This might be quite common with well-known domains such as microsoft.com, but I'd say it's unlikely on a domain as unknown as mine, unless a service had a known issue that could somehow be exploited in this manner.

When we hear of data for 100 million users being leaked, it is worth bearing in mind that a fair proportion of the records may be fake, or at least of dubious origin. This doesn't make the breaches and leaks any less serious, as they contain real information as well, and sites like HIBP are doing good work in making people aware of compromises and, hopefully, holding some to account.



JetBrains IntelliJ Community Edition

Posted: 2018-02-28 22:46:48



Having used JetBrains' PHPStorm for a long while in my PHP dev roles, I was interested to try their IntelliJ Community Edition offering.

It's built on the same IntelliJ IDEA platform as PHPStorm, but the main draw for a lot of people will be that it's free.

There are some limitations: for example, it doesn't support JetBrains' PHP plugin, which would otherwise turn it into PHPStorm for free. Other unsupported languages include CSS, Ruby and Javascript (full list here). If your language is supported through a community plugin, though, you get the power of JetBrains without the cost!

I still do a large amount of Perl Development. Thankfully the fantastic Perl plugin works a treat.

If you're unable to afford the license fees for your chosen JetBrains product, it's worth seeing if this will work for you in its cut-down form.



Reddit and Hacker News time sink

Posted: 2017-10-28 13:25:49



Four weeks ago, I finally deleted the Hacker News and Reddit apps from my phone.

I get a great deal of entertainment and information from these two sites; however, I've found that over time I spend more and more of my day on them. If I feel remotely bored or disinterested, my go-to tool is Reddit. Instead of realising that I have spare time and could use it productively, I would just sink it into browsing whatever dross was on there.

On top of this, I found it was starting to affect my sleep: if I woke up at 4.30am and was unable to get back to sleep, I would often grab my phone and browse. This was doing me no favours, hence my decision to delete the apps.

I still view both sites on my laptop, and I can obviously browse them on my phone too, but removing the one-tap icons from my screen has really had an effect; I find myself a lot less likely to browse just for the sake of browsing.

This is, of course, no guarantee that I will use my time more productively, but it certainly won't hurt.



Firefox Multi-Account Containers

Posted: 2017-09-18 11:59:05



For anyone that uses Firefox, I strongly recommend installing Multi-Account Containers: https://addons.mozilla.org/en-GB/firefox/addon/multi-account-containers/

It's written by Mozilla themselves and allows you to carve up Firefox into different containers for separation.

The containers are colour coded and each tab has the colour of the container it's running in. There is a Default container which is used for all websites until you decide otherwise.

This means if I open up a new tab in the Personal container and go to Github, I get my personal account. If I open my Work tab and visit the same site, it's logged into my Work account. No more logging in and out or running multiple browsers.

You can also pin websites to specific containers. Create a Finance container and pin your credit card, banking and ISA websites into it, and those sites will automatically open in that container. The result is much less cross-site tracking, and some extra protection against possible cross-site vulnerabilities.

Do it, do it now.



iproute and net-tools

Posted: 2017-09-14 17:01:52



Linux admins who have been doing their thing for at least 10 years will be very familiar with the standard networking tools (ifconfig, netstat, etc.) from the net-tools package. As you also know, these are no longer being developed and have been deprecated in favour of the newer iproute2 tool set.

This has been the case for many years, but I bet you still type ifconfig and route don't you?... Years of muscle memory is a hard habit to break.

Although the iproute2 tools have been available for a long time, I still find myself reaching for the old ones. When I catch myself doing that, I force myself to look up how to do the same thing with the newer tools. I'm slowly getting there, but it'll take many years yet.
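As a rough cheat-sheet, the common translations can be captured in a few lines of shell (a hypothetical helper, covering only the obvious cases):

```shell
#!/bin/sh
# Map a deprecated net-tools command to its iproute2 equivalent
nt2ip() {
    case "$1" in
        ifconfig) echo "ip addr" ;;
        route)    echo "ip route" ;;
        arp)      echo "ip neigh" ;;
        netstat)  echo "ss" ;;
        iptunnel) echo "ip tunnel" ;;
        *)        echo "no direct equivalent" ;;
    esac
}

nt2ip ifconfig   # ip addr
nt2ip netstat    # ss
```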

As a handy guide, I was passed this by a friend. It's most useful for those trying to transition and it's well worth a bookmark.

https://dougvitale.wordpress.com/2011/12/21/deprecated-linux-networking-commands-and-their-replacements/

As you transition, it's well worth remembering to update any scripts you have to use the new tools. There may come a time when net-tools is removed completely, and you'll want to make sure your trusty helper scripts don't fail you!



IT Consultancy Services

I'm now available for IT consultancy and software development services - Cloudee LTD.