Flying, Equipment setup & First Plane!

I have been out to the airfield a couple of times since my last post.  Nowhere near as often as I would have liked, but you can’t control the weather!

I am gaining more experience and have moved well beyond flying the circle pattern in both directions.  I progressed to figure eights, again in both directions, and that is now my warm-up whenever I first take control of the aircraft for the day!  The last few times out I have been flying boxes: you fly down the center of the runway, make a quick turn, fly out, make another quick turn, fly to the other end of the field, turn again, and then make a final quick turn that lines you up with the center of the runway, as if you are coming in for a landing, and fly right down the middle.  Essentially this is practice for landing the aircraft.  As I get better with the key turn (the one onto the runway), the box will get lower and lower until I am finally landing the plane.

Quick Turn

By a quick turn I mean banking the aircraft to the inside of your turn while pulling back slightly on the stick (a little up elevator to keep the nose level or up), then immediately rolling the wings level again.

This is required so that you end up level and lined up properly with the runway.  If for any reason something doesn’t feel right, you abort the landing.  This is why you should always land with fuel in the tank and avoid those dead-stick landings!  When you dead-stick there is no second attempt; you just pray you get it right on the first try and go with it.

Equipment

I have finally bitten the bullet and purchased the gear I need, which consists of the following:

  • Sig Kadet LT 40 ARF
  • 2 O.S. Glow Plugs
  • Glow driver
  • Spektrum DX6 Transmitter w/ AR610 receiver + 1 Free receiver (Gotta love this promotion as then I already have one for my next plane!)
  • Accu-Cycle charger
  • Charger leads
  • Fuel Pump

Thank you to Peter, who supplied the battery and switch for the plane, the servos, better landing gear (both wheels and assembly), and a stronger push rod for the throttle.  More on that in the next post about assembly!

Next Flight

Looks like my next flight is going to be with my very own plane!  More on that to come with the next post.

First flight at the Kenora airfield

Sig Kadet LT 40

On Wednesday I finally got out to the Kenora airfield to try my hand with a gas-powered trainer.  I got up for a couple of flights on the buddy box, with Peter holding the master control, as I flew counter-clockwise around the perimeter of the flying field.  I did really well for my first time out: not once did Peter have to take control of the aircraft due to a misstep on my part, and I was able to keep the plane at a constant altitude.  One key thing to remember is that as you’re turning you also need to pull back on the stick a bit to maintain lift; while banking in turns the plane will want to nose down and descend, which you do not want.

I haven’t taken the aircraft off yet, but in talking with Peter, one thing I need to remember is to level the plane out a bit after lift-off (don’t just try to climb and turn right away) to gain some airspeed before climbing and banking.  That ensures I have enough airspeed to continue the climb instead of stalling, losing lift, and sending the aircraft back to earth (what you don’t want, lol).

Flying was a lot of fun and I am excited to get out again.

Holding Sig Kadet LT40
Holding Sig Kadet LT40 that I am learning to fly on.

Kenora Aeromodelers Club Introduction

When I was home the weekend before last, I decided it was time to get in touch with my local R/C (remote control) flying club, since I’m now finally in a position to take up a hobby I have been wanting to get into for years.  Since the weather didn’t cooperate this past Friday, I met Peter (he responded to my email to the club) at King George school.

Our meeting started with introductions to the various members present, followed by going over some flying/club rules, like taking turns and how to rotate turns both indoors and out on the field.  We also went over the radio and how best to hold it: your hand controls the ‘steering’ stick (rudder only in this case) with two fingers, your thumb and pointer finger, so that your other fingers can reach the top controls.  I should have taken a picture, but I didn’t think to, lol.  For radios the club likes to use Spektrum and Airtronics.

ParkZone Night Vapor
ParkZone Night Vapor

I got to fly a ParkZone Night Vapor around the gym.  I started by taking over once the plane was in the air, flying counter-clockwise circles around the gym using as much area as I could.  I eventually got braver and threw in a figure eight or two.  I did have a crash or two where I ended up in a wall, but overall the instructor was impressed by how quickly I picked it up.  By the end I was taking the aircraft off myself and flying clockwise around the gym to mix it up, which, after becoming so comfortable with the other direction, threw me off a little at first.  The key points I need to remember are:

  • Keep the plane in sight (OK, fairly obvious)
  • Do not fly over top of my head (or anyone else’s, for that matter)
  • Kill the power if you crash (a crash is inevitable at some point, I suppose, and cutting power protects as much of the plane as possible)
  • Avoid helicopters as they’ll shred the plane

I was there for a few hours and really enjoyed my first time out with the club here in Kenora.  I am looking forward to getting to the airfield (outdoors) and flying a gas powered trainer once I’m back from vacation in NL.

Rename Files with PowerShell

While on holidays I started scanning a bunch of pictures for a family project.  Since this was being done in batches, each batch of scanned images would be named childhoodXXXXX.jpg, where XXXXX is the number of the current image in the sequence, and this numbering restarts for each batch.  Since I wanted a way of knowing which images were part of which batch (and thus the corresponding package they came from), I wrote a little script to rename the files in a folder, with the following requirements:

  • Prepend batch number to beginning of file name
  • Only prepend to files that do not already start with a digit (since we know all of the original names start with childhood)
  • The script is run from the directory containing the files
  • I didn’t handle a recursive folder structure, so all files must be in the same directory

The script is as follows:

$num = "3 - "
 
# Get everything in the current directory and pipe it through the loop
dir |
%{
    $name = $_.Name
    $c = $name.SubString(0,1)
 
    # Only rename files whose names do not already start with a digit
    if(-not [char]::IsNumber($c))
    {
        Rename-Item $_ ($num + $name)
    }
}

The script works by getting a list of all files in the current directory, via dir, and piping it to a loop which iterates over every file in the folder.  The % is a shorthand/alias for the cmdlet Foreach-Object.  For every file, the first character of the name is retrieved and tested to see if it is a number; if it is not, the Rename-Item cmdlet is used to rename the current object/file, represented by $_ in the loop, to the concatenation of $num and $name, i.e.

childhood0001.jpg would become 3 - childhood0001.jpg

You can make your prefix any string by setting the $num variable.  This means you could automate the process further by dynamically changing the prefix within an outer loop, or by passing it as a parameter to your script.
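If you go the parameter route, a minimal sketch might look like this (the script name and the `$prefix` parameter are my own illustrative choices, not part of the original script):

```powershell
# Rename-WithPrefix.ps1 -- sketch of passing the prefix as a parameter.
# The parameter name ($prefix) is illustrative, not from the original script.
param(
    [Parameter(Mandatory = $true)]
    [string] $prefix
)

Get-ChildItem -File |
ForEach-Object {
    # Only prepend when the name does not already start with a digit
    if (-not [char]::IsNumber($_.Name[0])) {
        Rename-Item -Path $_.FullName -NewName ($prefix + $_.Name)
    }
}
```

which you would invoke as, for example, `.\Rename-WithPrefix.ps1 -prefix "3 - "`.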

This post came out of me wanting to play with PowerShell over the holidays and as it worked out this will be my last post of 2014.

Ping Results to Database

I updated my script to write the results to a database, which allows for easier manipulation of the data through queries instead of having to traverse a file linearly.  The database design is very simple, consisting of only two tables, and can easily be expanded to hold more information about each device.  The bold fields in the tables are required.  There is a one-to-many relationship between a device and its ping results: each result belongs to exactly one device, but each device can have many results.
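A sketch of what the two tables might look like in T-SQL; the column types here are assumptions inferred from the values the script inserts, not a dump of my actual schema:

```sql
-- Sketch only: types are assumptions based on the values the script inserts.
CREATE TABLE [dbo].[Device] (
    [DeviceId] INT IDENTITY(1,1) PRIMARY KEY,
    [Name]     NVARCHAR(100)     NOT NULL,
    [IsActive] BIT               NOT NULL DEFAULT 1
);

CREATE TABLE [dbo].[PingResult] (
    [PingResultId] INT IDENTITY(1,1) PRIMARY KEY,
    [DeviceId]     INT      NOT NULL REFERENCES [dbo].[Device]([DeviceId]),
    [RunDate]      DATETIME NOT NULL,
    [Status]       INT      NULL,
    [Timeout]      INT      NULL,
    [TimeToLive]   INT      NULL,
    [ResponseTTL]  SMALLINT NOT NULL,  -- signed so the -1 sentinel fits
    [ReplyI]       NVARCHAR(50) NULL,
    [ResponseTime] SMALLINT NOT NULL   -- signed so the -1 sentinel fits
);
```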

Ping Results Database
Ping Results Database

The full script can be seen at the bottom of the post.  The script starts a job running the ping script block, four times in total, sleeping for 14 seconds in between.  It is not an exact science, but the idea is that the script can be scheduled using Windows Task Scheduler to run every minute: it pings all the machines once on starting and then roughly every 15 seconds, for four runs per minute.
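As an illustrative sketch (the task name and script path below are placeholders, not my actual setup), the script could be registered with Task Scheduler from an elevated prompt like so:

```powershell
# Illustrative only: "PingResults" and the script path are placeholders.
schtasks /Create /SC MINUTE /MO 1 /TN "PingResults" `
    /TR "powershell.exe -NoProfile -File C:\scripts\PingToDatabase.ps1"
```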

The script block creates a new SQL Client Connection object and opens a connection to the database, using the following code:

$sqlConnection = new-object System.Data.SqlClient.SqlConnection "server=IT-SERVICES\SQLEXPRESS;database=PingResults;User ID=username;Password=pass;"
$sqlConnection.Open()

Note that the username and password have been changed.

A command object is then created; the only thing we need to set is the command text (what we want to execute on the SQL server), which in our case is a SQL statement retrieving the device ID and name of every device marked as active.  We need both because we ping each device by its name, and when inserting we need to link the result to the correct device, which, as noted in the database design above, is referenced by the foreign key (FK) DeviceId in the PingResult table.  Once the command text is set, we execute the command by calling ExecuteReader().

#Create a command object
$sqlCommand = $sqlConnection.CreateCommand()
$sqlCommand.CommandText = "Select DeviceId, Name from [PingResults].[dbo].[Device] where IsActive=1"
 
#Execute the Command
$sqlReader = $sqlCommand.ExecuteReader()

The returned results are then looped through, pinging each machine in turn and formatting the results into a SQL insert statement.  Since the column won’t accept null, we set the response time to live and the response time to -1 when the returned value is null or empty; -1 is chosen because it has no practical meaning in this context (i.e. you can’t have a negative response time).

$sqlInsertQuery += "(" + $deviceId + ",'" + $RunDate +"'," + $status +"," + $timeout + ","+$ttl+","+ $rttl +",'"+$ri+"',"+$RT+"),"

The $sqlInsertQuery variable is initialized with the start of the insert statement, and each row to be inserted is formatted and appended to the end of the query using the line above.  Once all results have been concatenated, the loop exits and the trailing comma (,) is trimmed.  The database connection is then closed.  A new connection is opened, the command text is set to the built insert statement, the command is executed, and the connection is closed again.  This means each run of the script block issues two queries against the database: the first retrieves the list of active machines and the second inserts all the results.
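String concatenation works here because the values come from our own database and from WMI, but for anything user-supplied a parameterized command is the safer pattern.  A minimal sketch for a single row (only a subset of the columns is shown):

```powershell
# Sketch: inserting one result with SqlParameter instead of string concatenation.
$cmd = $sqlConnection.CreateCommand()
$cmd.CommandText = "INSERT INTO [PingResults].[dbo].[PingResult]
                    ([DeviceId],[RunDate],[ResponseTime]) VALUES (@id, @run, @rt)"
[void]$cmd.Parameters.AddWithValue("@id",  $deviceId)
[void]$cmd.Parameters.AddWithValue("@run", $RunDate)
[void]$cmd.Parameters.AddWithValue("@rt",  $RT)
[void]$cmd.ExecuteNonQuery()
```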

What’s Next?

The next phase of this project is to read the results that have been accumulating in the database and generate the appropriate table and graphs.  Each row of the table will consist of the machine name (linking to its day/week/month graph page), the percentage of downtime over the last 120 points (30 minutes), and the average latency over the same interval.  The last recorded latency will be displayed in a column whose font colour indicates whether the device is up (green) or down (red).

The full script:

#Win32_PingStatus class
#http://library.wmifun.net/cimv2/win32_pingstatus.html
#    11001 Buffer Too Small
#    11002 Destination Net Unreachable
#    11003 Destination Host Unreachable
#    11004 Destination Protocol Unreachable
#    11005 Destination Port Unreachable
#    11006 No Resources
#    11007 Bad Option
#    11008 Hardware Error
#    11009 Packet Too Big
#    11010 Request Timed Out
#    11011 Bad Request
#    11012 Bad Route
#    11013 TimeToLive Expired Transit
#    11014 TimeToLive Expired Reassembly
#    11015 Parameter Problem
#    11016 Source Quench
#    11017 Option Too Big
#    11018 Bad Destination
#    11032 Negotiating IPSEC
#    11050 General Failure 
 
$pingScriptBlock = {
 
	$sqlInsertQuery = "INSERT INTO [PingResults].[dbo].[PingResult]
			   ([DeviceId]
			   ,[RunDate]
			   ,[Status]
			   ,[Timeout]
			   ,[TimeToLive]
			   ,[ResponseTTL]
			   ,[ReplyI]
			   ,[ResponseTime])
		 VALUES
			   "
 
	# Connect and run a command using SQL Native Client, Returns a recordset
	# Create and open a database connection
	$sqlConnection = new-object System.Data.SqlClient.SqlConnection "server=IT-SERVICES\SQLEXPRESS;database=PingResults;User ID=username;Password=pass;"
	$sqlConnection.Open()
 
	#Create a command object
	$sqlCommand = $sqlConnection.CreateCommand()
	$sqlCommand.CommandText = "Select DeviceId, Name from [PingResults].[dbo].[Device] where IsActive=1"
 
	#Execute the Command
	$sqlReader = $sqlCommand.ExecuteReader()
 
	#Parse the records
	while ($sqlReader.Read())
	{
		$MachineName = ($sqlReader["Name"].ToString())
		$deviceId = ($sqlReader["DeviceId"].ToString())
		$RunDate = Get-Date
 
		$PingStatus = Gwmi Win32_PingStatus -Filter "Address = '$MachineName'"
		$status = ($PingStatus.StatusCode)
		$timeout = ($PingStatus.Timeout)
		$ttl = ($PingStatus.TimeToLive)
		$rttl = ($PingStatus.ResponseTimeToLive)
		$ri = ($PingStatus.ReplyInconsistency)
		$RT = ($PingStatus.ResponseTime)
 
		#Won't let me insert empty/NULL for the value so going to make these -1 (which has no practical significance since rttl & RT cannot be -'ve)
		if([string]::IsNullOrEmpty($rttl))
		{
			$rttl = -1
		}
		if([string]::IsNullOrEmpty($RT))
		{
			$RT = -1
		}
 
		$sqlInsertQuery += "(" + $deviceId + ",'" + $RunDate +"'," + $status +"," + $timeout + ","+$ttl+","+ $rttl +",'"+$ri+"',"+$RT+"),"
	}
 
	$sqlInsertQuery = $sqlInsertQuery.Trim(',')
 
	# Close the database connection
	$sqlConnection.Close()
 
	# Create and open a database connection
	$sqlConnection = new-object System.Data.SqlClient.SqlConnection "server=IT-SERVICES\SQLEXPRESS;database=PingResults;User ID=username;Password=pass;"
	$sqlConnection.Open()
 
	#Create a command object
	$sqlCommand = $sqlConnection.CreateCommand()
	$sqlCommand.CommandText = $sqlInsertQuery
 
	#Execute the command; this is an INSERT, so use ExecuteNonQuery rather than ExecuteReader
	[void]$sqlCommand.ExecuteNonQuery()
 
	# Close the database connection
	$sqlConnection.Close()
}
 
Start-Job $pingScriptBlock
Start-Sleep -s 14
Start-Job $pingScriptBlock
Start-Sleep -s 14
Start-Job $pingScriptBlock
Start-Sleep -s 14
Start-Job $pingScriptBlock
 
Write-Host "Waiting for jobs to complete"
While(Get-Job -State "Running") { Start-Sleep 2 }
 
# Display output from all jobs
Get-Job | Receive-Job
 
# Cleanup
Remove-Job *

Ping Results with CSV

This project came out of a coworker’s wish list: to see at a glance whether something is up or down, and to have a bit of history to analyze for any slow or dropped pings, hence the graphs.  The first phase (script), ping.ps1, which performs the ping tests, is from an external source my coworker found and was then modified by the two of us to get what we want.  The second phase, GeneratePingCharts.ps1, was written by me to generate the graphs from the output of the first script using RGraph, an HTML5 charts library.

GeneratePingCharts.ps1 output sample: Ping Results graphs
ping.ps1 output sample: Ping Results HTM

GeneratePingCharts.ps1

I am writing about this script first since I wrote it from scratch.  The script calculates the ping response time averages for the month, week, and day, using the data collected and saved in a CSV file, and outputs the graphs and averages to an HTML file.  For flexibility, the folder containing the CSV files, the folder where the HTML should be output, and the location of the RGraph library files can all be passed as parameters to the script.

The script starts by writing out the CSS it uses and then gathers the list of files with the CSV extension from the folder path passed in.  For each file found a new job is started, with a maximum of 10 jobs running at a time, to process the CSV file.  The script waits for all 10 jobs to finish and then removes them before starting 10 more.  This process continues until all files have been handled.

The line which starts the job by running the script block and passing the parameters is:

Start-Job $ProcessCSVScriptBlock -ArgumentList $file.FullName,$outputPath,$rGraphJS

The script block then initiates the processing of the CSV file by calling the Process-CSV function.  The CSV file is imported into a variable using the Import-CSV PowerShell cmdlet, and each row is then processed.  The sum of all response times and the number of response times collected are tracked in order to calculate the averages.  The response times are stored in a string with the format [a,b,c,…,z], where a through z are the response time values; it will contain as many values as the CSV holds for the given time frame.  The only exception is the month series, which stores the average response time over every $mInc points; otherwise the graph would not draw due to the large number of points.  All this data is then stored in a custom object, which is passed to the Generate-Chart function.
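The month-series down-sampling can be sketched in isolation: every $mInc raw points are averaged into a single chart point (the sample data here is made up):

```powershell
# Standalone sketch of the $mInc bucketing used for the month series.
$mInc   = 3
$points = 10, 20, 30, 40, 50, 60   # made-up response times (ms)

$out = @()
$sum = 0
$n   = 0
foreach ($rt in $points) {
    $sum += $rt
    $n   += 1
    if ($n % $mInc -eq 0) {
        $out += ($sum / $mInc)   # emit one averaged point per full bucket
        $sum = 0
    }
}
# For the sample above $out ends up as 20, 50
```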

The Generate-Chart function stores the HTML for the graph page in a variable inserting the appropriate information stored in the $chartOut parameter in with the HTML string so that the:

  • Machine name and run time are displayed at the top
  • Averages are displayed in a table
  • Day, week, and month graphs contain the appropriate data points.

Worth mentioning is that the Get-My-Date function converts the date, stored in an unsupported format (MM_DD_YYYY), to a supported format (YYYY-MM-DD) so that within the Process-CSV function it can be compared against today’s date, allowing the script to know which graphs the ping result belongs to.

<#
	.SYNOPSIS
		Calculates the ping response time averages for month, week, & day utilizing the data collected and saved in a CSV file by the script ping.ps1
		Creates a graph based on RGraph functionality and outputs all results to an HTML file.
	.PARAMETER csvFolderPath
		Where the CSVs are located and must all be in the same directory
	.PARAMETER outputPath
		Where the generated graphs should be saved
	.PARAMETER rGraphJS
		Where the JavaScript files are for RGraph used to generate the charts, relative to outputPath
	.EXAMPLE
		.\GeneratePingCharts.ps1 -csvFolderPath "C:\scripts\RESULTS" -outputPath "C:\Program Files (x86)\Lansweeper\Website\PingGraphs" -rGraphJS "../js"
#>
 
[CmdletBinding()]
 Param (
	[Parameter(Mandatory=$True)]
	[string] $csvFolderPath,
	[Parameter(Mandatory=$True)]
	[string] $outputPath,
	[Parameter(Mandatory=$True)]
	[string] $rGraphJS
)
 
$ProcessCSVScriptBlock = {
	param($fName, $op, $jsPath)
 
	<#
		Converts the date format used in the file to what Get-Date needs so we can do our comparisons
	#>
	function Get-My-Date ($date)
	{
		$parts = $date.Split('_')
 
		# year-month-day
		$df = $parts[2] + "-" + $parts[0] + "-" + $parts[1]
 
		return Get-Date $df
	}
 
	# Need to test
	function Generate-Chart
	{
		Param($chartOut,$op,$jsPath)
 
		$today = Get-Date
		$html = '<html>
				<head>
					<link rel="stylesheet" href="style.css" type="text/css" media="screen" />
 
					<script src="'+ $jsPath +'/RGraph.common.core.js" ></script>
					<script src="'+ $jsPath +'/RGraph.common.dynamic.js" ></script>
					<script src="'+ $jsPath +'/RGraph.common.tooltips.js" ></script>
					<script src="'+ $jsPath +'/RGraph.line.js" ></script>
					<script src="'+ $jsPath +'/jquery.min.js" ></script> 
					<!--[if lt IE 9]><script src="'+ $jsPath +'/excanvas.js"></script><![endif]-->
 
					<title>' + $chartOut.MachineName + '</title>
				</head>
				<body>
 
					<h3>' + $chartOut.MachineName + '</h3>
					<h5>Generated on ' + $today + '</h5>
					<table border="1">
						<tr>
							<th colspan="3">Response Time Averages (ms)</th>
						</tr>
						<tr>
							<th>Month</th>
							<th>Week</th>
							<th>Day</th>
						</tr>
						<tr>
							<td align="left">' + $chartOut.mAvg + '</td>
							<td align="center">' + $chartOut.wAvg + '</td>
							<td align="right">' + $chartOut.dAvg + '</td>
						</tr>
					</table>
					<hr>
					<h5>The Past Day</h5>
					<canvas id="Day" width="2000" height="300">[No canvas support]</canvas>
					<br/>
					<h5>The Past Week</h5>
					<canvas id="Week" width="2000" height="500">[No canvas support]</canvas>
					<h5>The Past Month</h5>
					<canvas id="Month" width="2000" height="500">[No canvas support]</canvas>
 
					<script>
						$(document).ready(function ()
						{
							var line = new RGraph.Line({
								id: ''Week'',
								data: ' + $chartOut.wData + ',
								options: {
									tooltips: ' + $chartOut.wDataToolTips + '
								}
							}).draw()
						})
					</script>
					<script>
						$(document).ready(function ()
						{
							var line = new RGraph.Line({
								id: ''Day'',
								data: ' + $chartOut.dData + ',
								options: {
									tooltips: ' + $chartOut.dDataToolTips + '
								}								
							}).draw()
						})
					</script>
					<script>
						$(document).ready(function ()
						{
							var line = new RGraph.Line({
								id: ''Month'',
								data: ' + $chartOut.mData + ',
								options: {
									tooltips: ' + $chartOut.mDataToolTips + '
								}	
							}).draw()
						})
					</script>
				</body>
				</html>'
		$html | Out-File -FilePath $($op + "\" +$chartOut.MachineName + ".html")
	}
 
	<#
		Reads a csv file in the appropriate format and calculates the last month, week, day response times (RT) as well as every date and response time.
 
		$file is the name of the file to process
	#>
	function Process-CSV
	{
		Param($file,$op,$jsPath)
 
		# Arrays for our chart data
		$mChartData = "["
		$wChartData = "["
		$dChartData = "["
		$mDataToolTips = "["
		$wDataToolTips = "["
		$dDataToolTips = "["
 
		$today = Get-Date
		$monthAgo = $today.AddMonths(-1)
		$weekAgo = $today.AddDays(-7)
		$mSum = 0
		$mPoints = 0
		$wSum = 0
		$wPoints = 0
		$dSum = 0
		$dPoints = 0
		$mIncSum = 0
		$mInc = 3 # How many points in month data to average to 1 point
 
		$csv = Import-CSV -Header MachineName,UpDown,RunDate,PingTime,Status,Timeout,TTL,RTTL,ReI,RT $file
 
		foreach($row in $csv)
		{
			# Array variable containing month, day, year
			$rDate = Get-My-Date $row.RunDate
 
			# Add down pings as empty, not included in sum/points for avg's
			if($row.UpDown -eq "down")
			{ #Need to figure out how to move the x-axis to allow negatives
				$mChartData += ","
				$wChartData += ","
				$dChartData += ","
				$mDataToolTips += "'" + $row.RunDate + ": " + $row.RT + "',"
				$wDataToolTips += "'" + $row.RunDate + ": " + $row.RT + "',"
				$dDataToolTips += "'" + $row.RunDate + ": " + $row.RT + "',"
			}
			else # Use the values in the file
			{
				# Handle month ago data
				if($monthAgo -lt $rDate)
				{
					$mSum += [int]$row.RT
					$mIncSum += [int]$row.RT
					$mPoints += 1
 
					if($mPoints % $mInc -eq 0)
					{
						$avg = ([double]$mIncSum / $mInc)
						$mChartData += [String]$avg + ","
						$mDataToolTips += "'" + $row.RunDate + ": " + [String]$avg + "',"
						$mIncSum = 0
					}
				}
 
				# Handle week ago data
				if($weekAgo -lt $rDate)
				{
					$wSum += [int]$row.RT
					$wPoints += 1
 
					$wChartData += $row.RT + ","
					$wDataToolTips += "'" + $row.RunDate + ": " + $row.RT + "',"
				}
 
				# Handle day ago data
				if(($today - $rDate).TotalHours -lt 24)
				{
					$dSum += [int]$row.RT
					$dPoints += 1
 
					$dChartData += $row.RT + ","
					$dDataToolTips += "'" + $row.RunDate + ": " + $row.RT + "',"
				}
			}
		}
 
		$mChartData = $mChartData.Trim(',') + "]"
		$wChartData = $wChartData.Trim(',') + "]"
		$dChartData = $dChartData.Trim(',') + "]"
		$mDataToolTips = $mDataToolTips.Trim(',') + "]"
		$wDataToolTips = $wDataToolTips.Trim(',') + "]"
		$dDataToolTips = $dDataToolTips.Trim(',') + "]"
 
		$chartData = New-Object System.Object
		$chartData | Add-Member -Type NoteProperty -Name mAvg -Value ([double]$mSum / $mPoints)
		$chartData | Add-Member -Type NoteProperty -Name wAvg -Value ([double]$wSum / $wPoints)
		$chartData | Add-Member -Type NoteProperty -Name dAvg -Value ([double]$dSum / $dPoints)
		$chartData | Add-Member -Type NoteProperty -Name mData -Value $mChartData
		$chartData | Add-Member -Type NoteProperty -Name wData -Value $wChartData
		$chartData | Add-Member -Type NoteProperty -Name dData -Value $dChartData
		$chartData | Add-Member -Type NoteProperty -Name mDataToolTips -Value $mDataToolTips
		$chartData | Add-Member -Type NoteProperty -Name wDataToolTips -Value $wDataToolTips
		$chartData | Add-Member -Type NoteProperty -Name dDataToolTips -Value $dDataToolTips
		$chartData | Add-Member -Type NoteProperty -Name MachineName -Value $csv[0].MachineName
 
		Generate-Chart $chartData $op $jsPath
	}
 
	# Do what you need to do
	Process-CSV $fName $op $jsPath
 
	# Just wait for a bit...
	Start-Sleep 5
}
 
# Entry point for script
#CSS file only needs to be done once so do it here
$css = 'body {
	font-family: Arial;
}
 
pre.code {
	padding: 5px;
	background-color: #eee;
	border: 2px dashed gray
}'
$css | Out-File -FilePath $($outputPath + "\style.css")
 
$csvFiles = Get-ChildItem $csvFolderPath -Filter *.csv
$count = 0 
Foreach($file in $csvFiles)
{
	$count += 1
	Start-Job $ProcessCSVScriptBlock -ArgumentList $file.FullName,$outputPath,$rGraphJS
 
	if($count % 10 -eq 0)
	{
		# Wait for 10 jobs to complete
		While(Get-Job -State "Running") { Start-Sleep 2 }
 
		# Display output from all jobs
		Get-Job | Receive-Job
 
		# Cleanup
		Remove-Job *
	}
}
 
Write-Host "Out of For waiting for last jobs"
# Wait for last jobs to complete
While(Get-Job -State "Running") { Start-Sleep 2 }
 
# Display output from all jobs
Get-Job | Receive-Job
 
# Cleanup
Remove-Job *

ping.ps1

The script reads a list of machine names, one per line, from a text file (named servers.txt below), loops through every machine, and writes the result of each ping to a CSV file.  Up results are written to a separate file with the appropriate HTML so they can be appended after the down HTML, keeping all the downs displayed at the top.  For each machine the script records the machine name, state (up/down), run date, ping time, status code, timeout, time to live (TTL), response time to live (RTTL), reply inconsistency (RI), and response time (RT) in a CSV file, both for historical purposes and to feed the second script, detailed after the ping.ps1 code.  This script is presently run as a scheduled task every minute.

# Red = #FF0000
# Green = #00FF00
# Blue = #0000FF
# Cyan (blue and green) = #00FFFF
# Magenta (red and blue) = #FF00FF
# Yellow (red and green) = #FFFF00
 
#Win32_PingStatus class
#http://library.wmifun.net/cimv2/win32_pingstatus.html
#    11001 Buffer Too Small
#    11002 Destination Net Unreachable
#    11003 Destination Host Unreachable
#    11004 Destination Protocol Unreachable
#    11005 Destination Port Unreachable
#    11006 No Resources
#    11007 Bad Option
#    11008 Hardware Error
#    11009 Packet Too Big
#    11010 Request Timed Out
#    11011 Bad Request
#    11012 Bad Route
#    11013 TimeToLive Expired Transit
#    11014 TimeToLive Expired Reassembly
#    11015 Parameter Problem
#    11016 Source Quench
#    11017 Option Too Big
#    11018 Bad Destination
#    11032 Negotiating IPSEC
#    11050 General Failure 
#		is in use on other pages:	background-color  #DCDCDC
 
# <--------------- Start script ------------------------------------------------->
clear
#ipconfig /flushdns
$pingResults =("C:\Program Files (x86)\Lansweeper\Website\PingResults.HTM") # <-- you need to change this
$up =("C:\scripts\up.txt") # <-- you need to change this
$RunDate = (get-date).tostring("MM_dd_yyyy")
$PingTime = (Get-Date -format 'hh:mm')
$PingMachines = Gc "C:\scripts\servers.txt"
$n=(6)
#Write the preamble of the report
#clear-content -Path $pingResults
clear-content -Path $up
$htmlContent += "<head><meta http-equiv='refresh' content='15' ><p>"
$htmlContent += "<title> Ping Results </title>"
$htmlContent += "</head><body bgcolor='#DCDCDC'>"
$htmlContent += "<h3><p align='center'>Report Generated " + $RunDate + " @ " + $PingTime + "</p></h3>"
 
$htmlContent += "<table border='1' align='center' style='width:50%'>"
 
ForEach($MachineName In $PingMachines)
{
$PingStatus = Gwmi Win32_PingStatus -Filter "Address = '$MachineName'"
$status = ($PingStatus.StatusCode)
$timeout = ($PingStatus.Timeout)
$ttl = ($PingStatus.TimeToLive)
$rttl = ($PingStatus.ResponseTimeToLive)
$ri = ($PingStatus.ReplyInconsistency)
$RT = ($PingStatus.ResponseTime)
If ($PingStatus.StatusCode -eq 0)
{
	Add-Content -Path $up ("<tr><pre><h6><td>"  + $MachineName +"</td><td><FONT color=#00FF00>`tUP</FONT></td><td><a href='./PingGraphs/" + $MachineName + ".html' target='_blank'>Graph</a></td></h6></pre></tr>")
}
Else
{
    $htmlContent += "<tr><pre><h6><td>" + $MachineName + "</td><td><FONT color =#FF0000>`tDOWN</FONT></td><td><a href='./PingGraphs/" + $MachineName + ".html' target='_blank'>Graph</a></td></h6></pre></tr>"
}
# send to csv file everything...
If ($PingStatus.StatusCode -eq 0)
{Add-Content "c:\scripts\RESULTS\$MachineName.csv" ($MachineName + ",up,"+ $RunDate +","+ $PingTime+","+ $status +","+$timeout+","+$ttl+","+ $RTTL +","+$ri+","+$RT)}
Else
{Add-Content "c:\scripts\RESULTS\$MachineName.csv" ($MachineName + ",down," + $RunDate +","+ $PingTime+","+ $status +","+$timeout+","+$ttl+","+ $RTTL +","+$ri+","+$RT)}
# $MachineName to $csv to put all results in one file
}
#put all Up results at end of file
$data = (get-content $up)
$htmlContent += $data + "</table>"
#Need to close the syntax of the HTML properly
$htmlContent += "</body></html>"
clear-content -Path $pingResults
$htmlContent | Out-File $pingResults

 

What’s Next?

The next phase of this project is to move away from the CSV files, as they are going to get extremely bloated.  We want to run the ping.ps1 script about every 15 seconds, which will generate four times the amount of data.  So I want to break the ping.ps1 script into two separate scripts: the first will perform the actual pings and write the results to a database, and the second will read the results from the database to generate the HTML table of up/down results, which will also be formatted differently and contain some additional information.  The GeneratePingCharts.ps1 script will then be modified to obtain its results from the database instead of the CSV files.

Portfolio & My Story

I have made some major overhauls to my website over the past month, to the point where I can deem things acceptable.  The latest updates include finding a portfolio implementation I like and making an initial start on my story: I have a start on the academics section but am still trying to figure out what to write and how to word parts of it.  I have a solid section for my athletics that gives a comprehensive overview; I’m not really sure whether I need to give any more details, like injuries, additional mini-stories, etc.

The plugin I came across that gives me the best look and feel for what I am going for is the Aeolus Portfolio WP Plugin.  This wasn’t as simple as install and configure.  The latest version as of this writing is 1.8, which, once installed and activated, did not work properly with my theme.  The issue I was having was that all the content would become ‘minified’ and illegible because the plugin altered the display of the theme.  My best guess is that this is due to the Bootstrap update in the plugin.

To get around this I removed the plugin I had installed through the WordPress Add Plugin feature, grabbed version 1.7 of the Aeolus Creative Portfolio plugin, unzipped the file, and uploaded the folder to my WordPress installation’s plugins directory.  I was then able to go to the Plugins section of the WordPress dashboard and activate version 1.7 of the plugin.  This did not cause any negative effects with my theme, so I continued on and configured my Portfolio page to display all portfolio items with pagination, similar to a parallax effect, since it gives the short description for items.  I’m not sure at the moment how well this is going to work long term as it doesn’t appear to show categories; however, you can filter by category, so once there are a lot of items I may be able to do one of the following:

  • Add multiple portfolio shortcodes to one page, filtered by category, and then manually add a top navigation with anchor tags/headings to the various sections
  • Add individual pages for each portfolio category as subpages of Portfolio, where the root page would either look like the first bullet point or contain links with explanations to the subpages

Now I just have to build up my portfolio.  This all started when a portfolio was a requirement for a job interview I had while finishing up my Master’s.  That is why the only portfolio items so far are for my academic projects; presently I have no work projects from past or present positions that I am allowed to divulge (or, for that matter, have access to anymore).

For personal projects, a portfolio item will appear upon completion of the first major version (1.0) and will be updated with any major features that get added, as well as link to the corresponding category containing all posts pertaining to the project.

Transition to Quark Theme

I have been working with the Revera theme for about 4 months now and have discovered that it does not work as well for me as I would have liked.  This is mostly due to the main page slider needing reasonably sized images, ideally all the same size to prevent annoying page movements; since I’m not running a photo-based blog, the theme does not make sense for me.

I have been casually looking around for a replacement theme with a clean look that is responsive and customizable.  I came across Quark and like how the theme is built upon HTML5 and CSS3; according to some of the reviews the code isn’t bloated and the theme is easy to customize, which I am going to venture into over the next week.

I have already started some basic, but key, code-level modifications: I added a variant of the default full-page template that does not allow comments, as I do not want users commenting on pages.  The one thing this theme doesn’t have that I need is a portfolio section, which leads me to the key elements I need to complete to get this theme working for me:

  • Portfolio template to allow for a section to showcase my applications
  • Full page template without comments (done)
  • Front page and footer widget areas the theme allows you to set
  • Front page display in general
  • Logo for my website
    • Main logo with website name and potentially caption
    • Favicon

I think this theme has the potential to work nicely for me so long as a theme update does not break my code-level customizations, which at this point amount to the addition of two new page templates.  I am going to do further reading on the theme to learn more about what Quark has to offer.  The theme does not allow you to bookmark posts based on the short links (with the ?p=#), so I may need to find a way to add that too; just in case, I have already changed the permalink structure for blog posts to /blog/%postname%/, which unfortunately will break any current links that are not short links.

Separation of Concerns: View Model & Database Model

I noticed that when I used scaffolding to generate my MVC5 controller with views using Entity Framework, I could only specify a model representing a table in my database, which is not what I want.  To have true separation of concerns I wanted to:

  • Use view models for my views that get validated prior to any database updates
  • Use database models, auto generated, to represent my database and contain the results of queries

I accomplished this by implementing various services which handle the communication with the database.  The controller contains an instance of the service it requires to communicate with the database.  The service queries the database and returns the results to the controller, which then sends the information to the view utilizing the appropriate format.

When querying the database (performing a select):

  • Resulting database model(s) are converted to the corresponding view model(s), and
  • Returned to the controller, where
  • They are passed through to the view for display
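As an illustration of the query flow above, a service method might project the Entity Framework entities into view models before handing them back to the controller.  This is only a sketch: the context name TrackerContext and the entity property names here are my assumptions, not the actual source.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using tracker.Models.View.ReadingList;

public class ReadingListService
{
    public IEnumerable<ReadingListModel> GetReadingListsForUser(int userId)
    {
        try
        {
            using (var db = new TrackerContext()) // hypothetical EF context name
            {
                // Project each database entity into its corresponding view model
                // so the controller and view never touch Entity Framework types.
                return db.ReadingLists
                    .Where(rl => rl.UserId == userId)
                    .Select(rl => new ReadingListModel
                    {
                        ReadingListId = rl.ReadingListId,
                        ListTitle = rl.ListTitle,
                        NumberOfNovels = rl.Books.Count()
                    })
                    .ToList(); // materialize before the context is disposed
            }
        }
        catch (Exception)
        {
            return null; // the controller treats null as a retrieval error
        }
    }
}
```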

When updating an entry:

  • View Model is passed from the view to the controller
  • Validated using an appropriate validator I built with Fluent Validation
  • If invalid, error message is returned to the view and displayed
  • If valid, view model is passed to the service to update the corresponding tables in the database
  • In either case if errors occur then the model state is updated and the default view for the action is returned

An example of this is how I manage reading lists.  I have my reading list controller (the ReadingListController class), which handles user interactions and manipulates the model.  You can see below that the controller class contains minimal logic and utilizes helper classes to perform it.  A validator class, built with Fluent Validation, validates the data passed from the view when creating/editing a reading list.  This ensures valid data is entered into the database rather than relying on exceptions thrown on insert/update due to ill-formatted values.

To provide separation of concerns, services are used to interact with the database model.  Utilizing a service also means that the controller does not need to know the implementation of the service or how it interacts with the database, so by providing another implementation of the particular service interface you can swap out the data access layer, e.g. if I decided to stop using Entity Framework and switch to NHibernate.

In the case of reading lists two services are used: the reading list service and the author service.  The author service is used so that we do not duplicate code for retrieving authors who have books, and the specific books for a selected author.  I am still working on the view implementation of this.  The reading list service contains the implementation for retrieving:

  • List of reading lists for a particular user (List title, number of novels in the list)
  • A specific reading list (for editing/deleting)
  • Book information for a book in a particular reading list
  • Details for a reading list (Book titles, year, whether they are read, and date finished)

There is also an implementation for editing, creating, and deleting a reading list.
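Putting these operations together, the service contract the controller depends on might look like the following.  The signatures are my reconstruction from the controller's call sites; the names of the details and book-info model types are assumptions.

```csharp
using System.Collections.Generic;
using tracker.Models.View.ReadingList;

public interface IReadingListService
{
    // Queries
    IEnumerable<ReadingListModel> GetReadingListsForUser(int userId);
    ReadingListModel GetReadingList(int readingListId);
    ReadingListDetailsModel GetReadingListDetails(int readingListId);     // type name assumed
    ReadingListBookModel GetReadingListBookInfo(int bookId, int readingListId); // type name assumed

    // Commands; Create/Edit report success so the controller can redirect to Failed
    bool CreateReadingList(ReadingListModel model);
    bool EditReadingList(ReadingListModel model);
    void DeleteReadingList(int readingListId);
    void DeleteBookFromList(int bookId, int readingListId);
}
```

Because the controller only ever talks to this interface, replacing the Entity Framework implementation would be a matter of writing a second class that implements it.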

[Authorize]
public class ReadingListController : Controller
{
	private readonly IReadingListService _service;
	private readonly IAuthorService _aService;
 
	public ReadingListController() : this(new ReadingListService(), new AuthorService()) { }
 
	public ReadingListController(IReadingListService service, IAuthorService aService)
	{
		_service = service;
		// _aService must be assigned too, otherwise AddBook/Books/MoveBook throw a NullReferenceException
		_aService = aService;
	}
 
	// GET: ReadingList
	public ActionResult Index()
	{
		int userId = GetUserId();
 
		var readingLists = _service.GetReadingListsForUser(userId);
 
		if(readingLists == null)
		{
			ModelState.AddModelError("", "An error occurred in retrieving your reading lists.");
			return View();
		}
 
		return View(readingLists);
	}
 
	// GET: ReadingList/Details/5
	public ActionResult Details(int? id)
	{
		if (id == null)
		{
			return new HttpStatusCodeResult(HttpStatusCode.BadRequest);
		}
 
		var readingListDetails = _service.GetReadingListDetails(id ?? 0);
 
		if (readingListDetails == null)
		{
			return HttpNotFound();
		}
 
		return View(readingListDetails);
	}
 
	// GET: ReadingList/Create
	public ActionResult Create()
	{
		return View();
	}
 
	// POST: ReadingList/Create
	// To protect from overposting attacks, please enable the specific properties you want to bind to, for 
	// more details see http://go.microsoft.com/fwlink/?LinkId=317598.
	[HttpPost]
	[ValidateAntiForgeryToken]
	public ActionResult Create([Bind(Include = "ListTitle")] ReadingListModel readingListModel)
	{
		readingListModel.UserId = GetUserId();
		var rlValidator = new ReadingListValidator();
		var results = rlValidator.Validate(readingListModel);
 
		if (results.IsValid)
		{
			var success = _service.CreateReadingList(readingListModel);
 
			if (!success)
				return RedirectToAction("Failed", new FailedModel { Message = "Unable to create reading list, please try again.", Action = "Index" });
 
			return RedirectToAction("Index");
		}
 
		return View(readingListModel);
	}
 
	public ActionResult Failed(FailedModel fm)
	{
		return View(fm);
	}
 
	// GET: ReadingList/Edit/5
	public ActionResult Edit(int? id)
	{
		if (id == null)
		{
			return new HttpStatusCodeResult(HttpStatusCode.BadRequest);
		}
 
		var readingList = _service.GetReadingList((int)id);
 
		if (readingList == null)
		{
			return HttpNotFound();
		}
 
		readingList.UserId = GetUserId();
 
		return View(readingList);
	}
 
	// POST: ReadingList/Edit/5
	// To protect from overposting attacks, please enable the specific properties you want to bind to, for 
	// more details see http://go.microsoft.com/fwlink/?LinkId=317598.
	[HttpPost]
	[ValidateAntiForgeryToken]
	public ActionResult Edit([Bind(Include = "ReadingListId,ListTitle,UserId")] ReadingListModel readingListModel)
	{
		var rlValidator = new ReadingListValidator();
		var results = rlValidator.Validate(readingListModel);
 
		if (results.IsValid)
		{
			var success = _service.EditReadingList(readingListModel);
 
			if (!success)
				return RedirectToAction("Failed", new FailedModel { Message = "Unable to edit reading list, please try again.", Action = "Index" });
 
			return RedirectToAction("Index");
		}
 
		return View(readingListModel);
	}
 
	// GET: ReadingList/Delete/5
	public ActionResult Delete(int? id)
	{
		if (id == null)
		{
			return new HttpStatusCodeResult(HttpStatusCode.BadRequest);
		}
 
		var rlm = _service.GetReadingList((int)id);
 
		if (rlm == null)
		{
			return HttpNotFound();
		}
 
		return View(rlm);
	}
 
	// POST: ReadingList/Delete/5
	[HttpPost, ActionName("Delete")]
	[ValidateAntiForgeryToken]
	public ActionResult DeleteConfirmed(int id)
	{
		_service.DeleteReadingList(id);
		return RedirectToAction("Index");
	}
 
	// GET: ReadingList/DeleteBook/5/4
	public ActionResult DeleteBook(int? readingListId, int? bookId, string bookTitle)
	{
		if (readingListId == null || bookId == null)
		{
			return new HttpStatusCodeResult(HttpStatusCode.BadRequest);
		}
 
		var bm = _service.GetReadingListBookInfo((int)bookId, (int)readingListId);
 
		if (bm == null)
		{
			return HttpNotFound();
		}
 
		return View(bm);
	}
 
	// POST: ReadingList/DeleteBook/5
	[HttpPost, ActionName("DeleteBook")]
	[ValidateAntiForgeryToken]
	public ActionResult DeleteBookConfirmed(int readingListId, int bookId)
	{
		_service.DeleteBookFromList(bookId, readingListId);
		return RedirectToAction("Details", new { id = readingListId });
	}
 
	// GET: ReadingList/AddBook/5
	public ActionResult AddBook(int? readingListId, string rlTitle)
	{
		if(readingListId == null)
		{
			return new HttpStatusCodeResult(HttpStatusCode.BadRequest);
		}
 
		var rlam = new ReadingListAuthorsModel
		{
			Authors = _aService.GetAuthorsWithBooks().ToList(),
			ListTitle = rlTitle,
			ReadingListId = (int)readingListId
		};
 
		return View(rlam);
	}
 
	public ActionResult Books(int authorId)
	{
		var books = _aService.GetBooksForAuthor(authorId);
 
		return Json(books, JsonRequestBehavior.AllowGet);
	}
 
	// GET: ReadingList/MoveBook/5/4
	public ActionResult MoveBook(int? readingListId, int? bookId, string bookTitle)
	{
		if (readingListId == null || bookId == null)
		{
			return new HttpStatusCodeResult(HttpStatusCode.BadRequest);
		}
 
		var bm = _service.GetReadingListBookInfo((int)bookId, (int)readingListId);
		var allReadingLists = _service.GetReadingListsForUser(GetUserId());
 
		if (bm == null || allReadingLists == null)
		{
			return HttpNotFound();
		}
 
		// Can't move a book to the list it is already in
		var usersReadingLists = allReadingLists.Where(x => x.ReadingListId != readingListId);
 
		// ToList() would create a throwaway copy, so assign the combined sequence back to the model
		bm.ReadingLists = bm.ReadingLists.Concat(usersReadingLists).ToList();
 
		return View(bm);
	}
 
	private int GetUserId()
	{
		// The user id precedes the first ':' in User.Identity.Name; fall back to 0 if parsing fails
		int userId;
		return int.TryParse(User.Identity.Name.Split(':')[0], out userId) ? userId : 0;
	}
}

Validation for creating a new reading list is shown below.  I chose this example since it is a very simplistic validator to implement.  The only criteria are that the list title has to be 1-255 characters long and that the returned number of novels in a list must be greater than or equal to zero (the two RuleFor rules).

using FluentValidation;
using tracker.Models.View.ReadingList;
 
namespace tracker.Validators.Novels
{
    public class ReadingListValidator : AbstractValidator<ReadingListModel>
    {
        public ReadingListValidator()
        {
            RuleFor(a => a.ListTitle)
                .Length(1, 255)
                .WithMessage("List name needs to be between 1 and 255 characters.");
            RuleFor(a => a.NumberOfNovels)
                .GreaterThanOrEqualTo(0)
                .WithMessage("Number of novels was less than 0, please contact admin.");
        }
    }
}

To utilize my view model instead of the database model for each view, I changed the @model directive to reference my view model.  This ensures that the model is fully populated within the database session before being passed back to the view.  An example is the details view for a reading list:

@model tracker.Models.View.ReadingList.ReadingListDetailsModel