Installing FreeNAS on a HP Microserver

Another post about my installation experience, mainly so that I can look up what issues I came across if I have to do this again. This is on my HP Gen8 Microserver (by now Gen10 is already out).

My old NAS4Free system couldn’t be updated, at least not without getting the server out of the inaccessible corner where it has been serving away for the last few years.

Handling snapshots was not easy, so I wanted to update, but thought that if the server was out anyway I might as well give FreeNAS a chance. I don’t remember why I decided NAS4Free was better for my purposes when I originally set up the server, and this time round I didn’t spend much time comparing the two systems, so I might have picked the less suitable one for my purposes.

FreeNAS or NAS4Free

Followed instructions from http://doc.freenas.org/11/install.html

Had to use sudo to get the ISO onto the USB stick. Copying took around 2 minutes.
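The exact dd invocation isn’t in my notes; on macOS it was something along these lines (the ISO filename is just a placeholder and disk6 is whatever number diskutil list shows for the stick on your machine):

diskutil list                          # find the device node of the USB stick, here /dev/disk6
diskutil unmountDisk /dev/disk6        # unmount it so dd can write to it
sudo dd if=FreeNAS-11.0-RELEASE.iso of=/dev/disk6 bs=64k   # write the image; takes a couple of minutes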

Didn’t work the first time round, despite following the instructions. Tried again, this time ejecting the disk in Terminal with

diskutil eject /dev/disk6

after the dd command. Still no joy.

Tried adding the USB stick only after the graphical boot screen, as described in a forum post.

Still didn’t work.

Tried to boot NAS4FREE from USB stick, also didn’t work.

Tried another computer (maybe the first one was acting up, virus scanner, …).

Didn’t work from the second Mac either.

Tried writing the ISO with Rufus on a Windows machine. The Windows app recommended in the FreeNAS documentation is outdated (Windows 7). Rufus: works.

Q: Why does writing the ISO no longer work with newer versions of Apple’s Disk Utility, when it used to work with the old one?

The FreeNAS installer didn’t seem to be able to install FreeNAS, either because the target was the USB stick the machine had booted from or because 8 GB is too small. Installed onto another USB stick. Worked.

To my surprise FreeNAS was able to import my old ZFS Volumes/Pool and I even had access to all Snapshots!

Deleting old snapshots is much easier than in my outdated version of NAS4Free. For example, to save space I only keep snapshots from the first of each month, so I can filter for all snapshots whose name does not contain 01- (as in auto-20150901-000000).
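In the FreeNAS UI this is just a filter on the snapshot list; a roughly equivalent check on the command line would be the following (the destroy step is deliberately commented out so you can review the list first):

zfs list -H -t snapshot -o name | grep -v '01-'        # snapshots not taken on the first of a month
# zfs list -H -t snapshot -o name | grep -v '01-' | xargs -n 1 zfs destroy   # delete them once the list looks right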

Setting up rsync

Created a user (Account / Add User) and added it to the wheel group.

Configured rsync as described at http://doc.freenas.org/11/tasks.html#rsync-module-mode

Configured Hyper Backup on the Synology NAS
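Before pointing Hyper Backup at the FreeNAS box it’s worth checking that the rsync module is reachable at all. Something like this (host, user and module name are placeholders for whatever you configured) lists the modules and the contents of one of them:

rsync rsync://freenas.local/                                  # list the modules the server offers
rsync --list-only rsync://backupuser@freenas.local/backups/   # list the contents of the 'backups' module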

HDD standby

Set up HDD standby:

Storage / View Disks / HDD name / edit

The standby time is given in minutes.

Services / SMART / Power Mode / Standby (check the drive unless it is in SLEEP or STANDBY mode)
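To check whether a disk has actually spun down, smartctl can be told not to wake it up (the device name is just an example and will differ on your system):

smartctl -i -n standby /dev/ada0    # reports that the drive is in STANDBY instead of spinning it up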


Avoid losing old Time Machine backups

Every now and then you get a problem with OS X’s Time Machine. When that happens the system asks you to start a new backup, which means losing all previous backups.

It hadn’t happened to me for more than two years or so, and even when it does happen I usually have something in place so that I don’t lose my backups: my NAS backs up to another NAS that can take snapshots, so normally I’d be able to roll back to a previous snapshot of the Time Machine share.

Unfortunately, when this happened to me recently I had just deleted the previous snapshots because I was running out of space and didn’t want to buy another hard disk – so I couldn’t roll back to a previous snapshot.

Well, for now I have just created a new Time Machine share, meaning that if I need to go back I can still access the old backups in the previous share.

PS: When I had an Apple Time Capsule this problem occurred every few months. With my Synology NAS it only seems to happen every few years.

 

Synology Hyper Backup

In a previous blog post, I described how I set up my Synology NAS to create network backups on another machine using

Synology: Backup & Replication
Backup Destination – Create Network Backup Destination
Backup – Create Backup and set a schedule

Unfortunately, one of Synology’s updates a few months ago removed this function, at least in its old form, and my NAS stopped backing up to my NAS4Free machine.

The functionality is, however, still available: it’s in Synology’s Hyper Backup package, which even remembers the old settings from Backup & Replication.

I thought I would have a lot of work getting the backups going again, but luckily I just had to start Hyper Backup.

By the way, Synology does now offer Snapshots, but unfortunately not for my model.

Using Google Analytics with MediaWiki

After upgrading a Wiki to MediaWiki 1.28 I noticed that in AdSense the page views immediately dropped to 20% of the pre-upgrade values.

I am not sure why that is. I have four hypotheses:

  1. AdSense doesn’t get displayed for all visitors anymore, even though it seems to display fine when I tried it on different desktop and mobile browsers.
  2. After the upgrade ad blockers block the ads for many visitors (but that shouldn’t cause a drop to 20% of the old numbers).
  3. The wiki gets cached now, so not every page view causes a new ad to be displayed.
  4. The upgrade caused the wiki to feature less prominently on Google, so fewer visitors are coming (unlikely, as I don’t think the effect would have been that immediate).

There could, of course, be many other reasons, but everything else I came up with seems even less likely than hypotheses 2 or 4.

Installing Google Analytics

My plan now is to install Google Analytics again to give me a better idea of what is going on, i.e. to see whether the page view numbers in Google Analytics match the ones from Google AdSense.

I did have Google Analytics installed on this wiki about ten years ago, but I took it off again. Of course, I don’t remember at all how I did it, so I thought this time I’d write down what I did.

I took the tracking code found in Google Analytics at

Admin / Property / Tracking Info / Tracking Code

…it looks like this

(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
...
...
ga('send', 'pageview');

and added it to MediaWiki:Common.js, which is accessible through the wiki itself, e.g. at index.php?title=MediaWiki:Common.js.

Using Docker to localhost PHP and mysql

If we want to run mysql, too, not only php (as shown here), here’s one way of doing it.

It is a bit more complicated. One reason is that the official php apache image doesn’t come with mysql support. Another reason is that you need to find out the IP address of the mysql container.

This is for developing, not for shipping applications.

Running mysql

docker run --name mymysql -e MYSQL_ROOT_PASSWORD=my-secret-pw -e MYSQL_DATABASE=testdb -e MYSQL_USER=testuser -e MYSQL_PASSWORD=testpw -d mysql:8.0

This time I use names, in this case mymysql. Setting a name is easier because we then already know the name of the container we want to link to (instead of having to look up a randomly assigned one).

my-secret-pw is the password for the mysql root user, set with MYSQL_ROOT_PASSWORD. In the same way, set the following to whatever you like, if needed:

MYSQL_DATABASE=testdb, MYSQL_USER=testuser, MYSQL_PASSWORD=testpw

You can check at the Docker Hub what mysql versions are available.

Running php

Start php as before, but use

--link mymysql:mysql

in this case mymysql, because that’s the name I chose for my mysql container.

We can’t use an unmodified image anymore (the official php apache image doesn’t come with mysql support), so we need a Dockerfile or we get a Class ‘mysqli’ not found error.

Create a Dockerfile with the following content

FROM php:7-apache
RUN docker-php-ext-install mysqli

Go to the folder with the Dockerfile, then build. In this case I call the image phpwsql:

docker build -t phpwsql .

then run

docker run --name myphp70 --link mymysql:mysql -d -p 80:80 -v /pathtofolder/:/var/www/html/ phpwsql
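To confirm that the mysqli extension really made it into the image, you can list the PHP modules inside the running container (using the container name from above):

docker exec myphp70 php -m | grep -i mysqli   # should print 'mysqli'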

Get the IP address of the mysql container

The mysql and the php containers will have different IP addresses, so localhost won’t work. To get the IP address of the mysql container type

docker inspect mymysql | grep IPAddress

which you can then use in your php code.
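If you only want the address itself, docker inspect can also print just that field (this works for containers on the default bridge network):

docker inspect -f '{{ .NetworkSettings.IPAddress }}' mymysql   # prints only the IP address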

 

At the time of writing this blog post both mysql and php-apache are based on debian:jessie, so there’s not much overhead.

Using Docker to localhost PHP

Another reminder for myself (or my students), this time how to use Docker to localhost PHP.

Download Docker from docker.com

Localhost php

In terminal run

docker run -d -p 80:80 -v /pathtofolder/:/var/www/html/ php:7.1-apache

-d for detach, i.e. run in background
-p for publish, i.e. publish container’s port(s) to the host
-v for volume, i.e. bind mount a volume

pathtofolder example: /Users/memm/GitHub/project1/backend/

You can check on Docker Hub which php versions are available other than 7.1-apache. The tags on the left are more specific; towards the right they get less specific, so you always get the latest matching version, but that might break your application.

See all containers

Use

docker ps

to see details of running containers. They tend to have funny names, like gigantic_snyder if you use it as described (i.e. if no name was specified).

Use

docker ps -a

to see all containers, not only running ones.

Remove all stopped containers with

docker rm $(docker ps -a -q)

Stop a container

To stop a container use

docker stop name

name example: gigantic_snyder
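If you’d rather not deal with the random names, give the container one yourself with --name when you start it (myphp71 here is just an example), then stop it by that name:

docker run --name myphp71 -d -p 80:80 -v /pathtofolder/:/var/www/html/ php:7.1-apache
docker stop myphp71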


The PHP repository on Docker Hub is at https://hub.docker.com/_/php/

Downloading and merging a .ts stream in OS X

Another brain dump to help me remember if I need to do this again.

Install wget

Build and install wget; the steps are taken (and updated) from another blog post.

curl -O http://ftp.gnu.org/gnu/wget/wget-1.18.tar.gz

tar -xzf wget-1.18.tar.gz

cd wget-1.18

./configure --with-ssl=openssl

make

sudo make install

cd .. && rm -rf wget*
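A quick check that the freshly built wget is the one that gets picked up (with the default configure prefix it ends up in /usr/local/bin):

which wget                     # should point to /usr/local/bin/wget
wget --version | head -n 1     # should report GNU Wget 1.18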

Download video

Find the file location in Safari’s Page Resources, then use it with wget

wget -r http://filelocation/filename_{1..999}.ts
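Since the brace expansion simply tries every number up to 999, wget will run into 404s once it gets past the last real segment; that’s harmless. To see how many segments actually arrived, count them in the directory wget saved them to:

ls filename_*.ts | wc -l   # number of downloaded segments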

Merge files

The three glob patterns keep the segments in numeric order (1–9, then 10–99, then 100–999):

cat filename_?.ts filename_??.ts filename_???.ts > all.ts