BerryCraft

BerryCraft, the BlackBerry Minecraft chat client, lets you chat in-game with your Minecraft buddies on the go, or administer your Minecraft server using standard commands. Source is also available for developers! More »

SynthNet

A (very) ongoing science project – the ultimate goal: a neural and genetic emulator capable of growing fully functional, biologically accurate neural networks from virtual DNA. Phase 1, growing a neural network from virtual DNA that is capable of associative learning and long-term potentiation, is now complete! More »

Creating a BlackBerry Game

Java programmers, get your BlackBerry out and start making games! This set of tutorials covers creating a BlackBerry game from start to finish by following the development of the Galactic Blast Demo, available at Synthetic Dreams. More »

Shredz64

A new game for the Commodore 64 based on Guitar Hero. By using the PSX64 interface, players can connect a PlayStation guitar controller to their C64 and play a favorite modern game genre on their beloved classic computer. More »

Personal Announcement

As can be seen from the post dates, I’ve experienced another one of my blogging hiatuses. This was due mostly to going into crunch mode trying to finish up emissary RT – I really needed to get it wrapped up before September, since I needed my schedule wide open by the start of the month. The reason why – I got accepted to grad school! I’ll be starting a Master’s program in Computer Science in a few days, and needed to get this final item checked off my list. I’m very excited for school, but equally happy to be finished with emissary RT – it was a fun project I’d had in mind for a long time, but after 2 years of development, I was ready for it to be complete.

In other news, I’ve really been diving back into my gaming roots lately. I recently finished listening to Masters of Doom on audiobook (a biography of John Carmack and John Romero – get it NOW if you haven’t read it already!), and along with bringing back a HUGE slew of memories from gaming in the 90s (shareware like Commander Keen, BBSes, the start of the Internet, etc.), it was incredibly inspiring to hear the story of passionate developers following their dreams and love of development. Along with this, I also finally started playing with Unity, which I’ve been meaning to try for a while. Long story short, I am completely hooked on the game engine and incredibly ramped up to start a new game (it’s been too long since my last one), so along with working more on SynthNet, this will be my next big project. More details soon!

New Website and Product

After many, many years (I’ve lost count at this point) of faithful service, I’ve finally refreshed the Synthetic Dreams website into something a little more modern and functional. Take a look if you’ve got a moment – it’s built on Drupal (of course) and features a responsive design for those browsing on the go.

Additionally, after almost 2 years in development, I’ve finally finished emissary RT – an ODBC driver that gives you access to a whole slew of things, from your file system to DHCP and DNS. The upshot is that you can use SQL (or the GUI in ODBC apps) to manipulate files and services in very powerful, automation-friendly ways. You can check out the full details on the Synthetic Dreams site as well.
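
As a purely hypothetical illustration (the DSN, table, and column names here are made up, not emissary RT’s actual schema), the idea is that any ODBC-capable tool can then reach the file system with plain SQL:

    import pyodbc  # third-party ODBC bindings (pip install pyodbc)

    # Assumed DSN and schema -- illustrative only, not the shipped driver's names.
    conn = pyodbc.connect("DSN=emissaryRT")
    cur = conn.cursor()

    # e.g. find oversized files under a directory using ordinary SQL
    cur.execute("SELECT name, size FROM files WHERE directory = ? AND size > ?",
                "C:\\logs", 10_000_000)
    for name, size in cur.fetchall():
        print(name, size)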


Article Featured on Qualcomm Spark Website

I realized while responding to some comments that I completely forgot to mention some exciting news! Last month, I was fortunate enough to have an article featured on the Qualcomm Spark website, “Can We Grow Artificial Intelligence?” It explores some of our current capabilities for emulating DNA and biological growth, and for incorporating those abilities into our normal programming tools to develop all sorts of AI. I had a lot of fun writing it, as well as reading the other articles featured on the site. So many exciting technologies on the horizon (or already here!).


Evolution Experimentation Module Complete

With the Genetic Mutation Engine completed, I wanted to put it to actual use. While it’s fun to put complex SynthNet networks through the mutation process and watch the really cool-looking results, doing it manually doesn’t serve much purpose. Now that the Evolution Experimentation Module is complete, however, the real power of the mutation engine is unlocked.

Artificial Selection in Action

The Evolution module allows us to take an initial, manually created SynthNet network (as simple or complex as desired), test how effective it is at a task, and then either allow it to reproduce and continue its genetic line, or prevent reproduction if task effectiveness decreases. It does this across multiple “breeds” (equally effective genomes) until a novel mutation shows improved performance, which is considered a new “species”. In effect, this emulates multiple genetic lines competing at a user-defined task, with artificial selection on that task dictating the evolutionary path of the SynthNet network.

Specifics of the module are as follows (a sketch of the core loop follows the list):

  1. Automatically and repeatedly mutates SynthNet DNA, grows the corresponding network, tests it, and records/compares the results.
  2. Stores and manages all “breeds”, or equally effective genomes, across all mutations.
  3. Selects for new “species”, or more effective genomes, and blocks reproduction of less effective ones.
  4. Detects cancerous (continuously growing) or unstable (requiring too much processor/memory to be feasibly used) networks and does not select for them.
  5. Can be used with any user-defined (programmed) task whose result can be quantitatively graded, allowing full flexibility to direct artificial selection.
  6. Along with effectiveness, also stores the structure (segment) count, neuron count, synapse count, and a graphical snapshot of each mutation.
  7. Stores all data in a MySQL database, allowing easy continuation of experimentation after interruption, as well as viewing results on the web (coming soon!).
  8. Programmed in Python for easy use/alteration/integration.
  9. All interaction between the Evolution module and SynthNet is done via the Peripheral Nervous System Protocol, allowing for remote use (SynthNet can run on a remote server with increased resources while the client runs at home).
  10. Also provides a menu to send manual commands to a SynthNet network via PNSP for easy manual manipulation, testing, and troubleshooting.
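
As a rough sketch of how those pieces fit together (my own illustration in Python – `grow_and_test` stands in for growing the network over PNSP and grading it at the task; none of these names are the actual module’s API):

    import random

    ALPHABET = "ABCDEF"  # stand-in for SynthNet's virtual nucleotide types

    def point_mutate(dna, rate=0.01):
        """Toy point mutation: change each virtual nucleotide with probability `rate`."""
        return "".join(random.choice(ALPHABET) if random.random() < rate else n
                       for n in dna)

    def evolve(seed_dna, grow_and_test, generations=1000):
        """Mutate, grow, test, and select; returns surviving breeds and best score."""
        breeds = [seed_dna]                # equally effective genomes ("breeds")
        best = grow_and_test(seed_dna)     # effectiveness of the starting network
        for _ in range(generations):
            child = point_mutate(random.choice(breeds))
            score = grow_and_test(child)   # None for cancerous/unstable networks
            if score is None:
                continue                   # never select for runaway networks
            if score > best:               # a more effective genome: a new "species"
                breeds, best = [child], score
            elif score == best:            # an equally effective "breed" survives
                breeds.append(child)
            # less effective genomes are simply blocked from reproducing
        return breeds, best

In the real module, each iteration would also record the segment/neuron/synapse counts and a snapshot to MySQL, with all network interaction going over PNSP.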

I’m currently trying it out by artificially selecting for a neural network that can detect parity (even/odd) in numbers.  We’ll see how it does – once I have some results, I’ll be creating the front-end user interface to browse through mutations/results/pictures on the web.  Hopefully more on that soon!


Genetic Mutation Engine Functional

More SynthNet goodness today! First off, I finished up the code changes that ensure all parts of SynthNet stay relatively in sync with each other. With one of my next big tasks being rate and temporal coding, timing within the system needs to be correct to support exact oscillating frequencies of action potentials, as well as resonance. There were some significant changes to the code, so I needed to retest most of the program again. That took up pretty much the month of August and the beginning of September – all works well though!

Mutative Madness!

Before taking the next step and jumping into the neural coding work, I wanted to program the functionality that allows for the mutation of SynthNet’s virtual DNA, accommodating evolution experiments. I finished the mutation engine itself a couple of nights ago, and am starting on the interface portion that will allow external programs to perform artificial selection experiments by monitoring the effectiveness of a DNA segment – either continuing its mutation if successful, or discarding the genetic line and returning to a previous one if less successful.

Below are examples of the effects of mutation performed on a virtual DNA segment. The first picture shows a network grown with the original, manually created DNA (the segment used in my classical conditioning experiment).

The next set of pictures shows the results of neural networks grown using DNA that has undergone 0.5% to 2% mutation. Most were beautiful to look at, but the networks in the final two pictures were also completely functional, supporting the proper propagation of action potentials and integration of synaptic transmission – only with an entirely novel configuration!


Really amazing to look at (I think!). Currently, SynthNet DNA can be exposed to the following types of errors in its genomic sequences:

  1. Deletion – Segments are removed entirely
  2. Duplication – Segments are copied in a contiguous block
  3. Inversion – Segments are written in reverse
  4. Insertion – Segments are moved and inserted into a remote section
  5. Translocation – Akin to Insertion, but two segments are swapped with each other
  6. Point Mutations – Specific virtual nucleotides are changed from one type into another

Currently, these operations result in in-frame mutations. It was actually easier to allow frameshifts to occur – however, SynthNet DNA is more sensitive to framing: whereas a biological read frame spans a fixed codon (3 nucleotides), SynthNet’s read frame varies from 1 to 6 virtual nucleotides. When I allowed frameshifting, the results were riddled with nonsense mutations, which prevented almost any meaningful growth of neural structures.
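
To make the in-frame constraint concrete, here’s a toy sketch (my own illustration, not SynthNet’s actual data structures or code): hold the genome as a list of variable-width codons and make every operator work on whole codons, so no mutation can shift the read frame:

    import random

    def in_frame_mutate(codons):
        """Apply one random structural mutation to a list of variable-width codons.

        Each codon is a short string of 1-6 virtual nucleotides; operating on
        whole codons keeps every mutation in-frame. Assumes len(codons) >= 2.
        """
        op = random.choice(["delete", "duplicate", "invert", "insert", "translocate"])
        i, j = sorted(random.sample(range(len(codons)), 2))
        if op == "delete":       # Deletion: a contiguous block is removed entirely
            return codons[:i] + codons[j:]
        if op == "duplicate":    # Duplication: the block is copied contiguously
            return codons[:j] + codons[i:j] + codons[j:]
        if op == "invert":       # Inversion: the block is written in reverse
            return codons[:i] + codons[i:j][::-1] + codons[j:]
        if op == "insert":       # Insertion: the block is moved to a remote section
            block, rest = codons[i:j], codons[:i] + codons[j:]
            k = random.randrange(len(rest) + 1)
            return rest[:k] + block + rest[k:]
        # Translocation: two segments trade places (simplified to single codons)
        swapped = list(codons)
        swapped[i], swapped[j] = swapped[j], swapped[i]
        return swapped

    # Point mutations (changing a single virtual nucleotide inside a codon)
    # also stay in-frame, since no codon boundaries move.
    genome = ["A", "CG", "TTA", "GC", "AACGT"]   # widths vary from 1 to 6
    print(in_frame_mutate(genome))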

Very excited with how things are turning out.  If, as seen in the last two pictures, we can get such novel pathway growth with a simple random mutation, I can’t wait to start the artificial selection routines and watch the results unfold!


Converting Your Corporate Intranet to Drupal

Though I have fun working on SynthNet and other projects at night, during the day I fill the role of mild-mannered network administrator at the Manchester-Boston Regional Airport (actually, the day job is quite a bit of fun as well). One of the ongoing projects I’ve taken on is consolidating all of our various Intranet-oriented services into a single platform for central management, easier use, and cost effectiveness. As mentioned in a previous article (linked below, see NMS Integration), I knew Drupal was the right candidate for the job, simply due to the sheer number of modules available for a wide array of functionality, paired with constant patching and updates from the open source community. We needed a versatile, sustainable solution that was completely customizable but wasn’t going to break the bank.

The Mission

The goal of our Drupal Intranet site was to provide the following functionality:

  1. PDF Document Management System
    1. Categorization, customized security, OCR
    2. Desktop integrated uploads
    3. Integration with asset management system
  2. Asset Management System
    1. Inventory database
    2. Barcode tracking
    3. Integration with our NMS (Zenoss)
    4. Integration with Document Management System (connect item with procurement documents such as invoices and purchase orders)
    5. Automated scanning/entry of values for computer-type assets (CPU/Memory/HD Size/MAC Address/etc)
    6. Physical network information (for network devices, the switch and port the device is connected to)
    7. For network switches, automated configuration backups
  3. Article Knowledgebase (categorization, customized security)
  4. Help Desk (ticketing, email integration, due dates, ownership, etc)
  5. Public Address System integration (allow listening to the PA system)
  6. Active Directory Integration (Users, groups, and security controlled from Windows AD)
  7. Other non-exciting generic databases (phone directories, etc)

Implementation

Amazingly enough, the core abilities of Drupal covered the vast majority of the required functionality out of the box. By making use of custom content types with CCK fields, Taxonomy, Views, and Panels, the typical database functionality (entry, summary table listings, sorting, searching, filtering, etc.) of the above items was reproduced easily. However, specialized modules and custom coding were necessary for the following parts:

  1. Customized Security – Security was achieved for the most part via Taxonomy Access Control and Content Access. TAC allowed us to control access to content based on user roles and the categorization of said content (e.g. a user in the “executive staff” role would have access to documents with a specific taxonomy field set to “sensitive information”, whereas other users would not). Additionally, Content Access allowed us to further refine access down to the specific node level, so each document can have individual security assigned to it.
  2. OCR – This was one of the few areas where we chose to use a commercial product. While there are some open source solutions out there, some of the commercial engines are still considerably more accurate, including the one we chose, ABBYY. They make a Linux version of the software that can be driven via the shell. With a little custom coding, we have the ABBYY software run on each PDF upload, turning it into an indexed PDF. A preview of the document is shown in Flash by first creating a SWF version with pdf2swf (part of SWFTools), then displaying it with FlexPaper.
  3. Linking Documents – This was performed with node references and the Node Reference Explorer module, allowing user-friendly popup dialogs for choosing the content to link to.
  4. Desktop Integration – Instead of going through the full steps of creating a new node each time (choosing a file to upload, filling in fields, etc.), we wanted the user to be able to right-click a PDF file on their desktop and select “Send To -> Document Archive” in Windows. For this, we did end up writing a custom .NET application that establishes an HTTP connection to the Drupal site and POSTs the files to it. The design of this application is an article in itself (maybe soon!).
  5. Barcoding – This was the last place we used a commercial product simply due to the close integration with our barcode printers (Zebra) – we wanted to stick with the ZebraDesigner product.  However, one of the options in the product is to accept the ID of the barcode from an outside source (text/xml/etc), so this was simply a matter of having Drupal put the appropriate ID of the current hardware item into a file and automating ZebraDesigner to open and print it.
  6. NMS (Zenoss) Integration – The article of how we accomplished this can be found here.
  7. Automated Switch Configuration Backups and Network Tracking – This just took a little custom coding and was not as difficult as it might seem. Once all our network switches were entered into the asset management system and we had each IP address, we had the module cURL the config via the web interface of each switch during the Drupal cron hook by feeding it a SHOW STARTUP-CONFIG command (e.g. http://IP/level/15/exec/-/show/startup-config/CR), which was saved and attached to the node. Additionally, we grabbed the MAC database off each switch (SHOW MAC-ADDRESS-TABLE) and parsed it, comparing the MAC addresses on each asset to each switch port and recording the switch/port location into each asset. We could now see where each device on the network was connected (see the sketch after this list). A more detailed description of the exact process may also become a future article.
  8. Help Desk – While this could have been accomplished with a custom content type and views, we chose to make use of the Support Ticketing Module, as it had some added benefits (graphs, email integration, etc)
  9. Public Address System – Our PA system can generate ICECast streams of its audio.  We picked these up using the FFMp3 flash MP3 Live Stream Player.
  10. Automated Gathering of Hardware Info – For this, we made use of a free product called WinAudit loaded into the AD login scripts. WinAudit takes a full accounting of pretty much everything on a computer (hardware, software, licenses, etc.) and dumps it to a csv/xml file. All our AD machines take an audit during login, then dump these files to a central location for Drupal to update the asset database during the cron job.
  11. Active Directory Integration – The first step was to ensure the Apache server itself was a domain member, which we accomplished through the standard samba/winbind configurations. We then set up the PAM Authentication module, which allows the Drupal login to make use of the PHP PAM package, which ultimately lets it use standard Linux PAM authentication – and once that is integrated into AD, it includes all AD accounts/groups. A little custom coding was also done to ensure matching Drupal roles were created for each AD group a user was a part of – allowing us to control access within Drupal (see #1 above) via AD groups.
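
As an illustration of item 7 above, here’s a minimal sketch of the cron-time backup step (the real work lives inside a custom Drupal module in PHP; this Python version is mine, the IPs and credentials are placeholders, and the MAC-table URL simply follows the same exec-URL scheme shown in the item):

    import requests  # third-party HTTP client (pip install requests)

    SWITCHES = ["10.0.0.2", "10.0.0.3"]  # in practice, pulled from the asset database

    def fetch_exec(ip, command, user, password):
        """Run a show command via the switch's HTTP exec interface."""
        # e.g. http://IP/level/15/exec/-/show/startup-config/CR
        url = f"http://{ip}/level/15/exec/-/{command}/CR"
        resp = requests.get(url, auth=(user, password), timeout=10)
        resp.raise_for_status()
        return resp.text

    for ip in SWITCHES:
        config = fetch_exec(ip, "show/startup-config", "admin", "secret")
        with open(f"backup-{ip}.txt", "w") as f:  # attached to the asset node in Drupal
            f.write(config)
        macs = fetch_exec(ip, "show/mac-address-table", "admin", "secret")
        # parse `macs` line by line to map MAC address -> switch port, then match
        # each asset's recorded MAC to locate the device on the network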

There was a liberal dose of code within a custom module to glue some of the pieces together in a clean fashion, but overall the system works really smoothly, even under heavy use. And the best part is, it consists mainly of free software, which is awesome considering how much we would have paid had we gone completely commercial for everything.

Please feel free to shoot me any specific questions about functionality if you have them – there were a number of details I didn’t want to bog the article down with, but I’d be happy to share my experiences.

The Beauty of the Demoscene

In this uber-connected, social-media-driven world, it seems like the time between when an idea is born and when it completely saturates the Internet twenty times over is almost nil. While this does mean seeing Dramatic Chipmunk and Nyan Cat to the point of retinal damage, it also has the benefit of introducing the masses to really cool ideas and projects from all around the globe. It means more people sharing their creations, which is a win-win for everyone.

Because of this mass spread of information, it always surprises me how many people are unfamiliar with the demoscene. Having grown up a Commodore 64 (and later Amiga) kid who hung out on BBSes, intros and demos were always a part of my computer world. At the time, they were amazing, mysterious creations, made by programmers with futuristic-sounding handles from faraway countries. As I grew older, I started not only to befriend many sceners, but also to think more about both the computer science and the art that actually went into these works – and my amazement only increased. Now I do everything I can to show off these programmatic, musical, and artistic feats to anyone who will watch.

The Scene

To quote Wikipedia, “The demoscene is a computer art subculture that specializes in producing demos, which are audio-visual presentations that run in real-time on a computer. The main goal of a demo is to show off programming, artistic, and musical skills.” Originally, demos started as shout-outs and other introductions in game cracks on 8-bit computers, showing off programming skill. They quickly bloomed into an entire culture of demogroups, competitions, parties, and boards – and the scene is still going strong today, with a solid European core (though it’s still prevalent in the US!). I encourage you to learn more about the awesome history behind the scene – there is more than can be covered in one blog post.

Favorite Demos

While the history is interesting, what matters more are the demos themselves! Below I’ve included 4 of my favorite demos. The first two are 64K PC demos. When I say 64K, I mean the entire demo is 64K in size. Graphics, music, code – everything. This is procedural programming on steroids – artistic and algorithmic wonderment.

The second two are for the Commodore 64. While they are more limited by the hardware, the talent still shines through. The first is a great example of an amazing musical score, and the second shows unbelievable coding and use of the C64 hardware, making it look more like a 16-bit machine.

This is just a taste of what has come out over the years – I encourage you to take a look at sites like pouet.net and The Commodore Scene Database for some more examples.  Be prepared to be amazed!

Helping the World Through Software

Recently, I started talking with my girlfriend about the idea of writing a life plan. The idea is similar in nature to a business plan, but instead of outlining the structure, mission statements, and strategies of a financial venture, you focus on the values, goals, and eventualities of your life as a whole. I’ve researched it a bit online, and the more I thought about it, the more I realized what a completely awesome tool a life plan could be – not only for organizing your life, but because the very process of writing one can really illuminate and flesh out life goals. More importantly, as I realized by talking with my friends, it can truly be a living document, one that grows over time as life, values, and situations change.

Though I am only in the planning stages of what I want to include in my plan, I know before I put a single word down that there are two items I will inevitably focus on. The first is one of my true passions in life – creating. Specifically, creating through computer science – games, AI, network utilities, anything. But ultimately I know this alone isn’t truly fulfilling. I read article after tweet after blog post about software development and computer science – some writing inspires me, and some falls flat. It took me a while to figure out why, and lately I’ve realized the reason. Which brings me to the second item I will focus on – helping the world. If I have a limited time on this big, blue globe, I want to do whatever I can to ensure that, at least in some small part, my creations make the world a better place. This – and making connections with other people who want to use their awesome skills to do some serious good! I’m lucky enough to have lots of friends with this attitude, and I’d love to make more.

Resources

To say there are a lot of amazing organizations out there changing the world on a daily basis would be an understatement – our lives change constantly with the evolution of social networks, mobile devices, and interconnectivity. And while many of these changes attack very real problems and improve quality of life, there is still infinite space to effect positive change – still countless opportunities to do good. I think it’s important to deliberately focus on these items as a core goal. I’ve recently begun to search online for resources and other like-minded buddies to help in this quest – and I’ve found a number in academia, as well as awesome sites like TED that feature some truly brilliant people focused on these very issues.

If you know of any other resources that talk about helping the world through computer science or other technology-driven philanthropy, please feel free to send them this way!

Or if you have any experience with writing a life plan or steps you’ve taken to clarify goals for yourself, please feel free to drop me a line!

I know there are other people much smarter than me who have tackled these areas before, so I’d love any guidance or tips.  I hope to continue to post on these subjects as I learn more and make further connections.


Finished up Multithreading in SynthNet

With BerryCraft done for now, I’m jumping back into SynthNet again – I finished up multithreading today. Now the emulator can utilize multiple cores/processors, which allows substantially larger neural networks (which means more complex behavior and more fun to be had). There is still quite a lot of optimization to be done, but this was the biggie by far – I wasn’t really able to make anything much larger than the previous test until I got this working.

I changed the DNA from the demonstration a bit and grew a network with more neurons in the neural pathway, then had it listen to a piece of music (“The Duel” from Electric Dreams). I captured a quick video of it – no practical demonstration, but fun to watch!

BerryCraft Source – Java Minecraft I/O Engine

Below is the source to BerryCraft. Most of the project is pretty trivial – however, MCIO.java (MineCraft IO) contains a class capable of full communication with a Minecraft server, correctly sending/receiving all Minecraft packet types as documented here. BerryCraft itself only implements chat/time functionality (and logins, of course), but MCIO could be used to build out any Minecraft functionality (player positioning, mob spawning and attributes, inventory, etc.).

If you do end up using MCIO, let me know – I’d love to see the results!

BerryCraft Source