I’ve been playing around with Docker (more specifically with SQL Server on Linux) since the beginning of 2017 to see if indeed I can “Build, Ship, Run, Any App Anywhere”. I’ll be putting together a series of how-tos and issues experienced, so you guys don’t spend countless hours trying to investigate and work out what is going on when things do not work.
I’ll be adding links to this main page as and when new posts are added as a central point of reference.
I (and many others) have done a series of Docker blog posts over the last couple of years, but they’ve all tended to revolve around spinning up a SQL Server 2017+ container for testing or demo purposes. This is only really the start. Think of the bigger picture here: once you have your database environment, the world is your oyster.
This blog post will show how we can use SQL Server 2019 CTP2.1 running on Linux (Ubuntu) in a docker container as our data source for a Power BI environment in next to no time!
These steps show a very manual process for completing this setup. If it is something you are looking to do frequently, then I suggest creating a Dockerfile and/or yml file and using docker-compose. This way you can have all your setup in one file, and it becomes a single statement to get your SQL Server 2019 environment up and running.
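As a rough illustration of what that single-file setup could look like, here is a minimal docker-compose sketch. The image tag and volume name are assumptions for illustration, so check the tags available on Docker Hub / MCR for the CTP you want:

```yaml
# docker-compose.yml - a sketch only; image tag and names are illustrative
version: "3"
services:
  sql2019:
    image: mcr.microsoft.com/mssql/server:2019-CTP2.1-ubuntu
    environment:
      ACCEPT_EULA: "Y"
      SA_PASSWORD: "P@ssword1"
    ports:
      - "1433:1433"
    volumes:
      - sqldatavolume:/sqlserver/data

volumes:
  sqldatavolume:
```

With that in place, `docker-compose up -d` brings the whole environment up in one go.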
Once you have Power BI installed, you will have to enable the Hyper-V Windows feature before installing Docker for Windows. If you don’t, the Docker install will ask you if you wish to enable Hyper-V, but in my experience this has rarely (i.e. never) worked well. It could just be me, but I play it safe and enable it before installing. Running Linux containers on Windows 10 inevitably means that the container will run as an Isolated / Hyper-V container, which basically runs a cut-down, lightweight Hyper-V VM.
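If you prefer to enable the feature from an elevated PowerShell prompt rather than clicking through Windows Features, something along these lines should do it (a reboot is required afterwards):

```powershell
# Enable the Hyper-V feature (run as Administrator); reboot when prompted
Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Hyper-V -All
```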
Once DfWCE (Docker for Windows Community Edition) is installed, you will have to increase the Memory setting to at least 3250MB. By default this is 2048MB, which isn’t enough to run SQL Server on Linux containers, so it needs to be amended.
Once configured, we’re now ready to create our SQL Server 2019 CTP2.1 on Ubuntu container. As we are going to be creating a container with a database we can use the data from (i.e. AdventureWorks2016CTP3), we’re first going to create a data volume and copy the .bak file (from C:\Docker\SQLServer\Backups) to it so it is available to restore:
#Create dummy container to define and copy backup file
docker container create --name dummycontainer -v sqldatavolume:/sqlserver/data/ sqlgeordie/sqlonlinuxdemo:demo
#Check that the container exists
docker ps -a
#Copy AdventureWorks or whatever you like ;)
docker cp C:\Docker\SQLServer\Backups\AdventureWorks2016CTP3.bak dummycontainer:/sqlserver/data/AdventureWorks2016CTP3.bak
#Check what volumes you have
docker volume ls
#Don't need the dummycontainer anymore so get rid
docker rm dummycontainer
NOTE: We have “created” a container but don’t need to run it, so the status will show “Created” rather than an up-time:
You may be asking why I’ve created a data volume as opposed to just bind mounting my local folder to the container. Good question. There are many blog posts out there explaining the pros and cons of both, but for me the biggest reason is that mounting a host folder goes against everything containers are about: portability. The container shouldn’t care about where it is running or about anything outside of it; bind mounting means that wherever this container is created, you will need that host folder set up and available. Using a data volume means we can reuse it for other containers, even on another host if we were in an environment with access.
Now we have our data volume with a copy of AdventureWorks2016CTP3.bak, we can proceed to creating the container:
#Create SQLLinuxLocalPersist container
docker run -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=P@ssword1' `
--name SQLLinuxLocalPersist -d `
-p 1433:1433 `
-v sqldatavolume:/sqlserver/data `
sqlgeordie/sqlonlinuxdemo:demo
#Check it is running
docker ps -a
All going well, your container should be created and running:
The next step is to create the database from the backup. We will do this manually via Azure Data Studio (other SQL Server management GUIs are available).
Simply connect to our locally created container. I have used the default SQL Server port of 1433, so we can use “Localhost”. As Windows authentication is not currently supported, we have to use a SQL login; in this example I’m using “sa”:
Next, we want to restore the DB:
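If you’d rather script the restore than click through a GUI, you could run sqlcmd inside the container along these lines. Note this is a sketch: the sqlcmd path assumes the mssql-tools layout used in Microsoft’s images, and the logical file names are assumptions, so check them first with RESTORE FILELISTONLY:

```powershell
# Check the logical file names in the backup first
docker exec -it SQLLinuxLocalPersist /opt/mssql-tools/bin/sqlcmd `
  -S localhost -U sa -P 'P@ssword1' `
  -Q "RESTORE FILELISTONLY FROM DISK = '/sqlserver/data/AdventureWorks2016CTP3.bak'"

# Then restore, moving the files to the default data folder
docker exec -it SQLLinuxLocalPersist /opt/mssql-tools/bin/sqlcmd `
  -S localhost -U sa -P 'P@ssword1' `
  -Q "RESTORE DATABASE AdventureWorks2016CTP3 FROM DISK = '/sqlserver/data/AdventureWorks2016CTP3.bak' WITH MOVE 'AdventureWorks2016CTP3_Data' TO '/var/opt/mssql/data/AdventureWorks2016CTP3.mdf', MOVE 'AdventureWorks2016CTP3_Log' TO '/var/opt/mssql/data/AdventureWorks2016CTP3.ldf'"
```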
You should now have a restored version of AdventureWorks2016CTP3 ready to use and abuse in Power BI. Simply set up a connection to (again) Localhost. No port mapping is required in this instance, but if you had a port other than 1433 you’d have to use something like: Localhost,15105.
Once connected, you can play around with the data to your heart’s content 🙂
There we have it: a fully working Power BI demo / sandbox environment ready for you to play around with. The longest aspect of this setup will be downloading and installing Power BI Desktop and DfWCE. Once set up, scripting the creation of the container can be done in a minute or two. AMAZING!!!!!
Having recently gone past our 5th birthday, we have decided to make some changes for 2019. Mike (t) and I have been extremely busy this year and perhaps not dedicated as much time to SQLNorthEast as we would have liked, so we have made a decision to bring in additional help with the organising. I would therefore like to announce two new additions to the committee in Glen Leck (t) and Mark Steinberg (t). I hope they know what they’re letting themselves in for 😉
We are also in the process of re-branding the user group from SQLNorthEast to Newcastle DPaC (Data Platform and Cloud), as we are not just SQL Server and cover most, if not all, areas of Microsoft’s Data Platform. The PASS website and associated details have been amended, and we have also set up a new Meetup, which is where our communications will come from now on. Twitter has also been updated, so please now use the handle @DPaCNewc.
This means that if you wish to find out the latest news and details of the next meetup, please join our Meetup and make sure you follow us on Twitter, otherwise you may miss out!
A newly created meetup, spun off from the Newcastle Data Platform and Cloud meetup, to help people across the North East get up and running with Power BI by offering a huge range of talks, from the very basics to some of the more advanced analytics. So whether you know nothing about Power BI and are looking to learn, or you are a seasoned pro, there will be something for everyone.
We will be running these meetups bi-monthly initially to gauge interest, with an aim to have them running every month at Quorum Business Park.
With 25 new members within a couple of days of setting it up and before we even officially announced it, we are expecting great things from this new Usergroup 🙂
Upcoming and previous speaking engagements. Links to slides and demos can always be found on GitHub or YouTube. If you would like me to speak at your event, whether in person or remotely, then please contact me.
A question which always gets asked regarding the union layered file system for images is whether the intermediary image layers can be merged into one. The answer is yes: you can pass --squash to the image build from a Dockerfile.
The attached video shows two example builds of the same Dockerfile: one standard, the second using --squash to merge the layers.
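For reference, the two builds boil down to the following. The image names are placeholders; note that --squash requires experimental features to be enabled on the Docker daemon:

```powershell
# Standard build: one image layer per Dockerfile instruction
docker build -t myimage:standard .

# Squashed build: intermediate layers merged into one
# (requires "experimental": true in the daemon config)
docker build --squash -t myimage:squashed .
```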
To view the image layers related to the newly created image you can run:
docker history NewImageNameHere
This will show all the layers created from each instruction in the Dockerfile, but the image ID against the intermediate layers is “missing” (only one layer / image ID now exists), and there will be a comment to indicate the merge has occurred:
If you are using Docker for Windows and want to switch between Linux and Windows containers, you can do this by right-clicking the Docker “whale” in the systray and selecting “Switch to Windows containers”:
….but no one likes clicking around do they!
There is an alternative way to do this, which I use in my Docker session demos; it makes things so much easier, and the switch is a lot quicker! In PowerShell, run the script below and it’ll switch from Linux to Windows containers and vice versa:
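A minimal version of such a script is a one-liner calling the Docker CLI helper that ships with Docker for Windows (assuming the default install path):

```powershell
# Toggle the daemon between Linux and Windows containers
& 'C:\Program Files\Docker\Docker\DockerCli.exe' -SwitchDaemon
```

Because -SwitchDaemon simply toggles whichever mode is active, running it again switches you back.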
As with anything open source, you’re prone to things changing without necessarily being aware of it, and the error message in the blog title is something that was sprung upon me whilst tidying up demos for my latest session on Docker and SQL Server on Linux. I hadn’t touched these particular demos since October 2017, so I don’t know exactly when things stopped working; they certainly worked the last time I did the session, but after re-pulling the images, things broke.
With the microsoft/mssql-server-linux images, if you wish to mount a volume to a container then you will experience an error message I’ve never seen in SQL Server:
The transaction log for database ‘master’ is full due to ‘NOTHING’.
Apparently I’m not the only one experiencing this; there is an issue open (since October 2017!!!) on GitHub and social.msdn.microsoft.com describing the problem.
I have tested:
and the issue was apparent on both images. The reason, as highlighted on GitHub, seems to be related to the file system; seeing as I’m running Docker for Windows, I can’t change my file system from ext3 to ext4 like others have done.
There is a workaround for this. The issue (certainly for me) only seems to occur if you try to mount a volume to a folder that already exists in the image (i.e. /var/opt/mssql/data/); mount to a new folder (i.e. /sqlserver/data) and it will work:
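To make the workaround concrete, here is a sketch of the failing and working variants side by side. The volume name is illustrative:

```powershell
# Fails (for me): mounting a volume over the existing SQL data folder
# docker run -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=P@ssword1' -p 1433:1433 `
#   -v sqlvolume:/var/opt/mssql/data -d microsoft/mssql-server-linux

# Works: mount the volume to a folder that does not already exist in the image
docker run -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=P@ssword1' -p 1433:1433 `
  -v sqlvolume:/sqlserver/data -d microsoft/mssql-server-linux
```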
With the announcement of SQL Server 2017 last week came the updated microsoft/mssql-server-linux and microsoft/mssql-server-windows* docker images.
By default, if you do a docker pull without a tag you will pull down the latest image; however, with these new Docker images this no longer works:
You get the error:
Error response from daemon: manifest for microsoft/mssql-server-linux:latest not found
I found this a bit strange, so I went onto Docker Hub searching for microsoft/mssql images to see if they actually did exist:
SQL on Linux is what I was after, and you can see from the image above that it does exist; but taking a closer look at the Tags tab, you can see that they have now amended the tags to microsoft/mssql-server-linux:2017-GA and microsoft/mssql-server-linux:2017-latest.
NOTE: The windows docker images still have the “latest” tag available
Now, if we run docker pull with the 2017-GA or 2017-latest tag, it will grab the image:
docker pull microsoft/mssql-server-linux:2017-GA
Obviously I have already done this and the image is up to date 🙂