Thursday 27 November 2008

iPhoto Script to Tag Duplicates

I merged a whole lot of photo folders and albums recently and ended up with hundreds of duplicates. Not wanting to clean up the mess manually, I searched for something that would help me out. I eventually found a script from Karl.

It inspired me to develop the idea a bit more. I wanted it to tag exact duplicate photos, photos that should be duplicates but are not, and photos that have been processed in some way. I also added some dialogs to remind you how to use it.

To use it, open iPhoto, select some or all of your photos and run the script.
I have also written a short script to remove the "duplicate", "similar" and "processed" comment tags so that you can 'reset' everything.

Note: this script only tags photos. It will not delete any photo (although it could be altered to do that if you wanted it to).

This is the AppleScript code. Copy and paste it into a file called pcIPhotoMarkDuplicates.scpt.

(*
iPhoto Mark Duplicates

Based on work by Karl Smith
http://blog.spoolz.com/2008/11/03/iphoto-applescript-to-remove-duplicates/
Copyright 2008 Phil Colbourn

This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.

This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.

You should have received a copy of the GNU General Public License
along with this program. If not, see <http://www.gnu.org/licenses/>.
*)
tell application "iPhoto"
display alert "This script will tag photos that are
* duplicates - identical,
* similar - dates and sizes match but somehow different, and
* processed - dates match but smaller than the 'original'.

WARNING: This script is only effective if the selected photos are in date order. Please ensure that the photos are sorted by selecting

View - Sort Photos - By Date, Ascending

from the iPhoto menu."
set curPhotos to selection
if (count of curPhotos) ≤ 1 then
display alert "You need to select the photos you want me to process."
else
-- this assumes the selected list of photos is in date order
set lastPhoto to item 1 of curPhotos
repeat with thisPhoto in rest of curPhotos
-- skip photos already tagged as duplicates
if comment of thisPhoto is not "duplicate" then
set dupFound to false
try
if (date of thisPhoto = date of lastPhoto) and (width of thisPhoto = width of lastPhoto) and (height of thisPhoto = height of lastPhoto) then

set thisSize to size of (info for (image path of thisPhoto as POSIX file))
set lastSize to size of (info for (image path of lastPhoto as POSIX file))

if thisSize = lastSize then
set diff to "anything but empty"
try
-- run the unix diff program to compare the files.
-- if they are the same the variable diff will be empty.
-- if they are different or an error occurs then diff will not be empty.
set diff to (do shell script "/usr/bin/diff -q '" & (image path of thisPhoto as text) & "' '" & (image path of lastPhoto as text) & "'")
end try

if diff = "" then
set comment of thisPhoto to "duplicate"
set dupFound to true
else
-- there must be subtle changes so I will mark thisPhoto
--set comment of lastPhoto to "similar" -- for testing
set comment of thisPhoto to "similar"
set dupFound to true
end if
else
-- here I assume that the larger file has more information
-- and therefore it is the original
if lastSize > thisSize then
set comment of thisPhoto to "processed"
set dupFound to true
else
set comment of lastPhoto to "processed"
-- thisPhoto is assumed to be the original
end if
end if
end if
end try
-- Last=This keep using Last
-- Last~This keep using Last
-- Last>This keep using Last - Last is assumed to be the original but the next could be the original
-- Last<This step onto This
if not dupFound then
set lastPhoto to thisPhoto
end if
end if
end repeat

beep
beep
display alert "All duplicate, similar and processed photos have been marked.

Switch to the Photos Library and search for one of these keywords: duplicate, similar or processed.

Then delete the photos you do not want.

NOTE: If you do nothing then no photos will be harmed.

(I will now try to switch you to the Photo Library.)"
set current album to photo library album
end if
end tell
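The script's exact-duplicate test - two files that are byte-for-byte identical - can also be sketched outside iPhoto. Here is a minimal Python sketch that groups files by SHA-256 content hash instead of running diff; the function name and usage are mine, not part of the script above.

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def find_exact_duplicates(paths):
    """Group files by content hash; byte-identical files share a digest."""
    groups = defaultdict(list)
    for p in paths:
        digest = hashlib.sha256(Path(p).read_bytes()).hexdigest()
        groups[digest].append(p)
    # Keep only groups with more than one file: these are exact duplicates.
    return [g for g in groups.values() if len(g) > 1]
```

Like diff -q, a hash comparison only finds byte-identical files; "similar" and "processed" photos still need the date and size heuristics the script uses.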


This is the AppleScript code. Copy and paste it into a file called pcIPhotoClearDuplicateComments.scpt.

(*
iPhoto Clear Duplicate Comments

Copyright 2008 Phil Colbourn

This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.

This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.

You should have received a copy of the GNU General Public License
along with this program. If not, see <http://www.gnu.org/licenses/>.
*)
-- clear comment field on all selected photos if it is duplicate, similar or processed
tell application "iPhoto"
set current album to photo library album
repeat with thisPhoto in (photos of current album)
if (comment of thisPhoto) is in {"duplicate", "similar", "processed"} then
set comment of thisPhoto to ""
end if
end repeat
end tell

Thursday 17 July 2008

Human Readable Compact Extensible Data Interchange Format

When working with Lotus Domino databases I needed a simple and compact way to pull data from one database into another.

I wanted something that was human-readable, extensible, order-agnostic and simple to process. After some experimentation I settled on an old idea to use delimited fields. But, I wanted to be able to add and remove fields and have them in any order. To do this I added a field name.

The format is:

Compact = Field* "~"

Field = "~" fieldname ":" data

The only constraints were:

  1. The field name could not start with a "~"
  2. The field name could not contain a ":"
  3. The data could not contain a "~"
Since ":" is rarely used as a field name and "~" is rarely used in data, this was a small price to pay. If you need "~" in the data then there are many ways to do this. One way we used was to append XML-like data to the end of the Compact and then extract the data in a XML-like way.

To extract a data element we used code like this:
@left( @right( compact ; "~" + fieldname + ":" ); "~" )
This gets the string right of the "~fieldname:" and then returns data left of the "~".

Similar code can be used to extract the XML data:

@left( @right( XMLData ; "<" + fieldname + ">" ); "</" + fieldname + ">" )
(It is XML-like so you need to handle escape sequences yourself.)
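As a cross-check, the @left/@right field extraction can be sketched in Python (the function name is mine; the delimiter logic follows the format above):

```python
def extract_field(compact: str, fieldname: str) -> str:
    """Mimic @left(@right(compact; "~" + fieldname + ":"); "~"):
    return the text between "~fieldname:" and the next "~"."""
    marker = "~" + fieldname + ":"
    start = compact.find(marker)
    if start == -1:
        return ""  # field not present
    start += len(marker)
    end = compact.find("~", start)
    return compact[start:end] if end != -1 else compact[start:]

record = "~RCCode:AM~RCTitle:Asset Management~Discount:20~"
print(extract_field(record, "RCTitle"))  # Asset Management
```

Note that data containing ":" (like the Desc field in the examples below) is handled correctly, because only "~" terminates the data.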

Here are some examples. You can see we even carried code and XML-like data as data.

~ID:TS~SubID:VP~Icon:44~SubIcon:~SLANo:~Z1RT:2~Z2RT:2~Z1CT:8~Z2CT:8~Z1ET:7.2~Z2ET:7.2~Pri:Standard~Desc:Telephone Service:VoIP Service (Ericsson)~SubForm:VP~LocSubForm:Location~FailureCategories:~PeriodDef:~ReactionDef:~PBVer:4.3;0~NItems:~<ConPrice>242</ConPrice><InstPrice>0</InstPrice><MACPrice>@If(movecomplexity="0";56;movecomplexity="1";111;-1)</MACPrice><FRDSCauses>TS-VP</FRDSCauses><ResponseTime>0</ResponseTime><RepairTime>0</RepairTime><EscalateTime>0</EscalateTime><PBProductCode></PBProductCode><PBPrice>360</PBPrice><ExcludeDays></ExcludeDays><BusinessHours>[08:00]:[17:00]</BusinessHours><BillingAudit>Options(OP)~VM(OP)</BillingAudit><Applications></Applications><LineSpeeds></LineSpeeds><Interfaces></Interfaces><CDIDType></CDIDType><MIMSNoDef>PL15TE01</MIMSNoDef><AssetCategory></AssetCategory>

~RCCode:AM~RCTitle:Asset Management~MCode:~MCustCode:000001~MIDs:~Discount:20~DiscountConnect:10~

~LocA:PTSA13~Location:George St 47, Level 12N Conference Rm~Lat:30.123456~Long:151.123456~SiteZone:Metro~


Sunday 13 July 2008

Fog Machine

This is one of those Now-what-am-I-going-to-do-with-it problems.

I picked up a box with a fog machine (Antari F-160) and fog fluid in it from a recent council clean-up. Since it was being thrown out I doubted that it would work. And it didn't.

After replacing a blown fuse, the fog machine heated up but the fluid pump would not work. The pump would hum when power (240 VAC) was applied but it would not pump, so I decided it was seized and set about dismantling it in order to clean it.

There didn't appear to be anything wrong. Nothing was blocked, it was clean and no part seemed damaged. If I loosely assembled the pump it would work, but if it was tightly assembled the pump would just hum.

I decided that someone on the net would be able to verify that I was assembling it correctly. I searched for the part number "SP-12A" and "Fog", and came across a page that had a link to a picture showing the order of the parts.

In my pump, the washer/seal (part 5) was in a different place. I corrected my assembly and it worked nicely. With this seal (perhaps it is a spacer) in the wrong place, a little orange seal is compressed too much and grips the piston so tightly that it no longer can move and the pump can only hum. With the seal in the correct place, the piston moves freely and all is well.

The machine uses a remote control to turn the heater on and it has a switch to operate the pump. It was clear that the remote control cable was damaged so I repaired it, but the pump would not operate when the switch was depressed. I found that the remote control socket had 3 broken solder joints which I repaired.

The fog machine now worked.

Apart from entertaining the kids, what can you do with a fog machine?
  • Fill the house with fog and run a fire drill (get down low and Go Go Go!)
  • Check for air conditioner ducting leaks (but I don't have air conditioning)
  • Look for drafts around doors and windows
  • Cool party feature
  • Find exhaust leaks?
Maybe someone else has some thoughts.

Sunday 6 April 2008

YouTube Blackout

Introduction

I sent this to a few friends at the beginning of March 2008 (one of them posted it on his blog). It may still be of interest, but the BGP data for the YouTube event may no longer be available.

You can read more about the event on Arstechnica.

The technique I describe may be used for any event - you just need the network address and the dates surrounding the event.

YouTube Outage BGP Replay

Last week Pakistan Telecom was ordered by their government to block access to YouTube. They did this by redirecting routes containing the YouTube addresses to an internal dead end. A mistake was made that advertised these routes to their peer, and thus to the remainder of the internet.

The IP range that was affected was 208.65.152.0/22. Pakistan Telecom announced a more specific route, 208.65.153.0/24, which is a longer prefix match and so takes priority.
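The longest-match rule can be illustrated with Python's standard ipaddress module - a sketch of the route-selection idea, not of BGP itself:

```python
import ipaddress

# The legitimate aggregate and Pakistan Telecom's more specific announcement.
prefixes = [ipaddress.ip_network("208.65.152.0/22"),
            ipaddress.ip_network("208.65.153.0/24")]

addr = ipaddress.ip_address("208.65.153.10")  # an address inside YouTube's range

# Both prefixes cover the address, but routers prefer the longest mask.
matches = [p for p in prefixes if addr in p]
best = max(matches, key=lambda p: p.prefixlen)
print(best)  # 208.65.153.0/24 wins, so traffic follows the bogus route
```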

To see what happened go to this site and start the java applet.

http://bgplay.routeviews.org/bgplay/

To see how it should look, enter this address: 208.65.152.0/22, with yesterday's date and today's date - don't worry about the time fields. [This route should also work now: 208.65.153.0/24]

The AS (autonomous system - roughly, a large ISP, a large company or a national carrier) numbered 36561 is where YouTube packets should generally be sent. You can see all the lines from other domains leading to this AS for this address range.

Press the play button (a small triangle) to start the animation. You should see the links occasionally changing as changes are made or transmission links break or are fixed.

Now press the New Query button.

Then enter the address that poisoned BGP: 208.65.153.0/24, 23/2/2008 and 26/2/2008 - a period around the event.

Initially the page has no lines. This is because prior to the event, this route was not used. Just imagine all the links still going to AS 36561.

Press the Play button.

Over time you see the rogue Pakistan Telecom AS 17557 start to become the priority route for all YouTube traffic until it seems to have all the routes. When the fault was fixed you can see the links moving back to AS 36561 where they should be.

Imagine how easy it is to interrupt any domain, or even the whole internet. I think this risk will be fixed shortly.

Useful OS X Applications and Utilities

This is a list of OS X applications and utilities that I, or others I know, find useful. Most are open source or freeware.

  • Wireshark 1.0
  • Quicksilver (apparently much more than a free app launcher - two people I know can't do without it)
  • HandBrake - for DVD => mp4.
  • MetaX - tag mp4 files.
  • VLC
  • Fairmount (requires VLC) - mounts a DVD as though it had no CSS.
  • MacTheRipper - rips DVD to disk.
  • Cyberduck - GUI-based SFTP/FTP client.
  • TextWrangler - excellent free text / code editor.

  • Firefox 3 - should be released in June 2008. Beta versions seem stable.
  • AppTrap - uninstalls apps when you drag them to the trash.
  • FanControl - keeps a MacBook cooler.
  • MenuMeters - CPU/network/etc. meters.
  • VMware Fusion - virtual machines.
  • Google Desktop - shows local files when searching Google. Also a fast way to run apps.
  • Google Earth
  • OpenOffice 2.4 (Aqua) - improving each month.
  • Burn and Firestarter - CD burners.
  • CronniX - schedule tasks.
  • iStumbler - WiFi scanner.
  • Fink - once they release a 10.5 version it will be easy to install.

Saturday 29 March 2008

Earth Hour - What happened in 2007?

Was there any significant effect and if so, was there any grid instability?

The Sydney Central Business District (CBD) showed an estimated 10.2% drop in electricity consumption, but the rest of the metropolitan area showed only a 2.1% drop, and the grid was stable, according to the Australian National Electricity Market Forum (NEMF) Minutes 78 - 26 April 2007.

http://www.nemmco.com.au/nemgeneral/057-0342.pdf

EARTH HOUR IMPACT ON THE NEM

Wayne Jackson, EnergyAustralia, presented the Earth Hour Impact on the NEM. 'Earth Hour' was the launch event of a 12 month campaign to reduce Sydney's greenhouse gas emissions.

Sydneysiders were requested to turn off their lights for one hour, between 7.30 pm and 8.30 pm, on 31 March 2007. The objective was to reduce Sydney's greenhouse gas emissions by 5% over a 12 month period.

EnergyAustralia's role was to participate in Earth Hour and also to provide load data to help measure the outcomes of the exercise. Media headline results of Earth Hour are shown below:

• 65,000 individuals and over 2,000 businesses pre-registered
• survey reported 2 million households participated
• 10.2% drop in Sydney CBD consumption
• saving of 24.86 tonnes CO2-e in Sydney CBD
• saving of 52.08 tonnes CO2-e in Sydney Metro

The impact on the Sydney CBD and the EnergyAustralia network was measured.
Data was collected from 5 substations in the Sydney CBD area. Based on Saturdays in March and April from 7:30 to 8:30 pm, a typical electricity consumption profile was collected and adjusted for temperature (19.8°C on the night) and for daylight saving.

The measured load was 204.9 MWh. The typical Saturday consumption between 19:30 and 20:30 is 228.18 MWh. Therefore, 23.28 MWh of energy was saved, a reduction in energy consumption of 10.2%.

Measurement from EnergyAustralia's Sydney metropolitan BSPs (bulk supply points) shows that 48.76 MWh of energy was saved, a reduction in energy consumption of 2.1%.
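The CBD figures are easy to verify - a quick sketch, with the numbers taken from the minutes above:

```python
typical = 228.18   # MWh, typical Saturday 19:30-20:30, Sydney CBD
measured = 204.90  # MWh, measured during Earth Hour
saved = typical - measured
print(round(saved, 2))                  # 23.28 MWh saved
print(round(100 * saved / typical, 1))  # 10.2 per cent reduction
```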

On the day, network stability in the Sydney metropolitan area was very good; there were no switching or load-surge problems at 8:30 pm.

The WWF reported the Sydney CBD figures but didn't mention the much lower result for the greater Sydney metropolitan area.

http://wwf.org.au/news/congratulations-sydney-earth-hour-2007-results/

This year Earth Hour will be on March 29 from 8:00 PM to 9:00 PM (half an hour later).

http://www.earthhour.org.au/