Some quotes for you, to give you a sense of what I'm feeling.
It's a gimped tablet computer.
Like the internet? Well you don't get to use all of it. No flash (flash is terrible, but you miss a ton of the internet without it)
Okay I'll just listen to internet radio while doing some emailing and facebook-ing. No multitasking.
I want to draw on it. No stylus
Okay I'll just type up some notes on it during class. No keyboard. Onscreen keyboard works well for your thumbs on a phone size device - for full regular typing with no tactile feedback? No.
Okay I'll just load pictures from my digital camera on it and use it as a big gorgeous portable display device. NO SD FUCKING SLOT. NO USB SLOT. WHAT THE FUCK.
$500 fucking dollars. HALF a thousand dollars.
Via Ars Comments
Image (C) Engadget (thus hotlinked via my usual policy)
And am I missing something or does this not do handwriting recognition? You know, like the Windows Tablet PC software has since 2002?
The eBook reader stuff is another example of Apple mimicking real life objects unnecessarily. Creating a "library" page that looks like a real bookshelf and a book interface that visually resembles a book does not make this "easier to use" or "nicer." It makes it unprofessional looking, actually. Childish.
And don't get me started on the superiority of eInk over any screen display. It's no contest unless you're trying to fast track to bad vision.
- There's no multitasking at all. It's a real disappointment. All this power and very little you can do with it at once. No multitasking means no streaming Pandora when you're working in Pages... you can figure it out. It's a real setback for this device.
- The ebook implementation is about as close as you can get to reading without a stack of bound paper in your hand. The visual stuff really helps flesh out the experience. It may be just for show, but it counts here. Comment: Still not E-Ink. If the software is good it might be better than a standard PC, but really?
- No camera. None, nada. Zip. No video conferencing here folks. Hell, it doesn't have an SMS app!
- It's running iPhone OS 3.2.
- The keyboard is good, not great. Not quite as responsive as it looked in the demos.
- No Flash confirmed. So Hulu is out for you, folks!
My god, am I underwhelmed by the iPad. This is as inessential a product as I've ever seen, but beyond that, it has some absolutely backbreaking failures that will make me judge anyone who buys one.
And here I was thinking that, if it was implemented well, I might eventually upgrade my netbook to one. Hah. I probably wouldn't even have a post on this if it weren't for years and years of rumors culminating in, well, this.
For more on the underwhelming: PC World, CBS, Newsweek, ZDNet informal poll, Lifehacker informal poll. When David Pogue has little good to say, and even tries to spin backlighting as good, you know something is wrong.
First, I realize I missed this week's Tuesday Tetrapod. I'll put up a double-feature next week — but I've been trying to meet a personal deadline and didn't quite have time to give the TT the attention it deserved. So, let me touch a little bit on SVGs and, thus, in a roundabout way, what's been occupying my time.
First, "SVG" stand for "Scalable Vector Graphic". As the image at the right demonstrates, zooming in on raster graphics such as bitmaps, PNGs, or JPGs introduces artifacts. Further, if you want to reuse the image, you cannot scale it beyond a certain resolution. Vector graphics, on the other hand, are very much like text in that their descriptions are essentially plaintext describing how lines arc. The lines arc the same way no matter how zoomed in you are, so they are rendered on the fly by computers. This means that they are infinitely resizeable, and retain fidelity limited only by the physical pixel sizes on your monitor.
Now, the problem is, of course, Internet Explorer. It's not the only problem, but the main one. See, IE doesn't support SVGs at all. Just can't do a thing with them. While Firefox has finally passed IE6 in market share, IE still holds ~65% of the global market (though on technical sites, its market share is closer to 20%). So this was a show-stopper for SVG adoption.
Now, with that hurdle out of the way, we have a few other hoops to jump through. First, of the Gecko-, Webkit-, and Presto-based browsers (read: Firefox, Safari + Chrome, and Opera), Gecko and Webkit each incompletely support external SVG resources via the two tags that should work, <object> and <img> (Presto properly supports both). Gecko does not support <img> at all, and instead displays the alt text; Webkit supports <object>, but manipulating sizes and such does not work.
Now, I wanted to use an SVG-based logo on my site, and I was fed up with this implementation problem, so I wrote up a simple PHP library to get SVG working conveniently on multiple browsers. Now, to use an SVG, all you have to do is
<?php dispSVG('PATH/TO/FILE.svg'[,'ALTERNATE TEXT',WIDTH_IN_PX,HEIGHT_IN_PX,HTML_ID,HTML_CLASS]); ?>
Where everything in the square brackets is optional (do note, however, that if no height or width is specified, IE will default to 100px by 100px). So, for example, on this implementation test page, the logo with the yellow background comes from just:
<div style='background:yellow;width:500px;height:600px;'> <?php dispSVG('assets/logo.svg','logo',450); ?> </div>
That's it. Nice, simple, clean, and cross-browser. All you have to do is extract the library to a location on your server, edit the $relative variable in svg.inc if it's not in the root directory, and paste the following into the head of your (X)HTML document:
<?php require('PATH/TO/svg.inc'); // edit this path if(browser_detection('browser')=='ie') echo ""; ?>
Once you've done that, you're done! Just call dispSVG whenever you want to display an SVG. By giving the ID or class arguments a value, you can address your SVG as normal in your CSS. Nice and easy! It's not quite perfect, as CSS backgrounds only work with Presto (of course. Go Opera!), but it works for most uses.
Now, all that is because I'm working on creating a site to advertise my website-creation work and deliver built-to-order computers to customers. The computer-building system is almost done, requiring some modification on the last four system types to give configuration options. Then I just need to generate some content for the other pages, and it'll go live!
So, I ask (what readers I have) a favor: can you please check out the site and give me any stylistic, functional, or content critiques, recommendations, or comments you may have? With luck, it should soon have the proper domain http://www.velociraptorsystems.com/!
Oh man, this article from ArsTechnica just says so much. The skinny:
Monticello, Minnesota was getting bad internet service. The voters passed a referendum to build a municipal, fiber-to-the-home service. TDS, a local telco, sued. And sued. And sued some more. And after stonewalling the government for long enough, TDS unveiled today a 50 Mbps symmetrical fiber-to-the-home service for all residents, at $49.95 per month.
Such stories aren't limited to Minnesota suburbs, either. Just last month Telephony Online ran a piece on how Cox cable prices had "dropped considerably" since Lafayette, Louisiana lit up a fiber system of its own.
"Cox froze the cable rates in Lafayette, and they didn’t freeze the rates in other areas," said Terry Huval, director of the muni project. "We figured our citizens saved over $3 million in cable rates even before we could offer them service."
Big surprise, we don't have enough competition and when we suddenly get it, hey presto, prices drop and service improves — even if the government (albeit local) has to inject competition into the marketplace.
So, first, this really drives home the whole "telcos are brain-dead and kinda vaguely evil" point; second, we do not have enough broadband competition — the market is dominated by two or three carriers covering 75-80% of customers.
Third — does this have an analogue I'm missing? Hm... oh wait. A government-sponsored public option is supposed to do the same bloody thing. You know, force competitive rates among insurers. Lower prices for Americans. Etc. But naaaah, we've never shown that'd work. Oh wait.
Yes, there's more to the health care bill than that, with conservative estimates sponsored by America's Health Insurance Plans (read: the insurance lobby) even showing a 47% decrease in premiums relative to today's levels under a no-public-option bill. But the public option is what gives the bloody thing teeth!
Oh, net neutrality. As you may know, the FCC proposed rules for net neutrality last week and opened the stage up to commenting. Here are the proposed rules:
- Consumers are entitled to access the lawful Internet content of their choice
- Consumers are entitled to run applications and use services of their choice, subject to the needs of law enforcement
- Consumers are entitled to connect their choice of legal devices that do not harm the network
- Consumers are entitled to competition among network providers, application and service providers, and content providers.
- A provider of broadband Internet access service must treat lawful content, applications, and services in a nondiscriminatory manner
- A provider of broadband Internet access service must disclose such information concerning network management and other practices as is reasonably required for users and content, application, and service providers to enjoy the protections specified in this rulemaking
Now, I do have some issues with this, including the framework for "reasonable" management.
But overall, I feel like this is a good, large step forward. In my opinion, net neutrality rules should be quite simple:
- ISPs may not discriminate against, manage, inspect, or otherwise treat any set of data on their network in any way that is not applied uniformly and identically to all data on the network. Random fluctuations in network reliability are permissible, subject to a statistical significance test to confirm random distribution and additional tests to confirm such fluctuation is protocol-agnostic.
- Access to content may not be restricted or controlled in any manner, unless such content comes from an IP address with zero legal content. At that time, restriction to access of that IP address is at the discretion of the ISP, but not mandatory.
- None of these principles affect the legality of actions taken on the internet.
- Violations of these principles are subject to a $100,000 fine (per instance).
In other words, I'd love to see network neutrality legislation that favors a "dumb pipe". But OK, that's fine, like I said — this is a good step. Who couldn't favor it?
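For what it's worth, the statistical check imagined in my first rule above could look something like this toy sketch. This is entirely my own construction (function name, threshold, and data are all made up; 3.84 is just the chi-square critical value for p = 0.05 with one degree of freedom), not any proposed FCC methodology:

```python
# Toy sketch: decide whether packet-drop counts look protocol-agnostic
# by comparing per-protocol drop rates against the pooled expectation,
# chi-square style. All names and numbers here are illustrative.
def drops_look_random(sent, dropped, threshold=3.84):
    """sent/dropped: dicts keyed by protocol name, values are packet counts."""
    total_sent = sum(sent.values())
    pooled_rate = sum(dropped.values()) / total_sent
    stat = 0.0
    for proto in sent:
        expected = pooled_rate * sent[proto]  # drops if loss were uniform
        if expected > 0:
            stat += (dropped[proto] - expected) ** 2 / expected
    return stat < threshold  # small statistic: no protocol stands out

sent = {"http": 10000, "bittorrent": 10000}
print(drops_look_random(sent, {"http": 50, "bittorrent": 55}))   # True
print(drops_look_random(sent, {"http": 50, "bittorrent": 500}))  # False
```

A real test would need far more care (time-of-day effects, congestion, sample sizes), but the point stands: protocol-targeted throttling shows up as a statistical fingerprint.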
It turns out that John McCain (and, though it's not explicitly mentioned here, a number of other Republican representatives taking large donations from AT&T, Verizon, and/or Comcast) has a bone to pick:
The money trail is actually painfully obvious. But that doesn't change the fact that the openinternet.gov site is being deluged with anti-net-neutrality comments from those getting their pockets filled by telcos. The FCC is seeking out comments, and it's important that you comment — either the quick way on openinternet.gov's discussion page, or more "officially" by submitting a filing on ECFS (Proceeding/Docket Number "09-191").
Help keep the net open! If you ever have doubts, just consider for a moment how hard it is to change your internet carrier ... and if they decided to, say, cut off Google from you, what your recourse would be. That is what Net Neutrality is about.
This morning I was part of the Microsoft "New Efficiency" developer track meeting and I thought I'd use this as an interesting blog post in addition to a way to take notes.
I was part of the Windows Server 2008 R2 track — while I'd have preferred the Windows 7 track, considering the project I've been working on, this is arguably more useful. Unsurprisingly, I am the only one I can see running linux (Nike, my Asus Eee PC 701 series notebook [named after the Greek winged goddess of victory], has insufficient space to run any version of Windows effectively, and even Ubuntu can be cramped). So, without further delay, Windows Server 2008 R2:
- 9:05: Starting the program
- The bulk of the content applies to Windows 7 as well. It's noticeable that the UI has vestiges of the Win7 interface. I'm not sure if the lack of Aero is a product of the computers or the OS
- Scales to at least 256-socket solution
- 64-bit only. Makes a lot of sense and about time.
- Win7 x64 and R2 are essentially the same kernel — but 64-bit only. It seems like this implies Win7-x86 is a backport.
- DWM is optimized, to about half the memory usage
- Native VHD support — they're mountable and show up in disk management.
- There's a live demo: he switched right into server management and handled it from disk management.
- There's a .NET API to access virtual disks directly. Interesting.
- The native mounting means the file structure can be used without launching the VM
- Services won't start until they're needed, finally — e.g., not until the relevant device is attached (like the Bluetooth service). Sample service on cfx.codeplex.com
- Essentially, it seems like services don't have to be manually set to manual, if the programmer codes things properly.
- "Server Core" is a GUI-less Server, so the whole thing is more streamlined
- 9:26: Server Core and ASP.NET
- Server Core loads up a terminal session. "No GUI" is a bit misleading, because there is still a mouse and such — it's just one large command prompt, essentially.
- The vast majority of components can be disabled — out of a list of perhaps 50 items, only four were enabled.
- The server terminal is case-sensitive. Mixed blessing, but I'm used to it anyway.
- Notepad is installed as the primary text editor. It needs Emacs ;-)
- By the way, this presentation was being streamed, so you might be able to watch it if you're interested.
- Still, the percentage bars are horribly misleading. XKCD anyone?
- Speaking of XKCD, Geocities is shutting down today. Back up your websites with HTrack if you want to save your nostalgia.
- Impressive — the Server Core instance was using only 350 MB of memory while applications were up.
- 9:37: PowerShell
- PowerShell now has remoting capability. I'll have to see if it's built into Windows 7 — I might be able to start scripting against that directly.
- There's a PowerShell Integrated Scripting Environment — nicely readable
- get-command is a nice command definition, a mini-man.
- invoke-command can run a session entirely remotely, to the extent that commands defined on the remote machine are not accessible locally.
- I like the Powershell capabilities. get-process is like a mini-top
- Not all methods are defined when remoting
- On the other hand, you can use a PSSession to essentially ssh/telnet into a machine, which leaves most methods available.
- Blazing fast coverage, but looks like the initial talk for Server 2008 R2 is done.
Next: Parallel programming.
OK, I lied: the brain comes first. Even though I wrote this as if the body post were already up, it should still mostly make sense.
The core fact to realize about our own consciousness, etc., is that we are not programmed with it. Trying to program self-awareness into a computer simulation has problems because there is no place to start from. Our behaviour is an emergent behaviour, so we need to duplicate this in a simulated fashion.
Given that we've already talked about the modularity of the body plan, how is this body controlled? We assign a virtual neuron cluster to each degree of freedom, with each virtual cluster having members equal to the granularity we want to start with. Thus, consider the human elbow. It would get one virtual neuron cluster to control its up-and-down motion (as when you flex your arm). The number of virtual neurons in this cluster is determined by:
- 1 per degree of rotation. About 160 for the human elbow.
- Each virtual neuron can have an arbitrary number of connections within the cluster, but regardless of input signals received, it only outputs one unit of amplitude. We then have 160 control neurons, each with one "lead" outside the cluster. Control neuron 2 fires two movement neurons, instigating a 2° rotation. Control neuron 139 fires 139 movement neurons, for a 139° rotation. Thus, "basic" members of the available movement pool have more connections (more control-neuron connections) than more extreme members (i.e., movement neuron 160 is only connected to one control neuron)
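The elbow-cluster scheme above can be sketched in a few lines. This is my own illustrative encoding of the idea (names and the 160° figure are taken from the text, everything else is a stand-in):

```python
# Sketch of the elbow cluster described above: control neuron k recruits
# k movement neurons, and each movement neuron contributes one unit of
# amplitude, i.e. one degree of rotation.
ELBOW_RANGE = 160  # rough range of motion of the human elbow, in degrees

def fire_control_neuron(k, cluster_size=ELBOW_RANGE):
    """Return the rotation (degrees) produced by firing control neuron k."""
    if not 1 <= k <= cluster_size:
        raise ValueError("control neuron index out of range")
    movement_neurons_fired = k      # neuron k recruits k movement neurons
    return movement_neurons_fired   # one unit each, so k degrees total

def connections_per_movement_neuron(m, cluster_size=ELBOW_RANGE):
    """Movement neuron m is wired to every control neuron >= m, so 'basic'
    low-index movement neurons have the most control-neuron connections."""
    return cluster_size - m + 1

print(fire_control_neuron(139))              # 139-degree rotation
print(connections_per_movement_neuron(160))  # 1: only control neuron 160
print(connections_per_movement_neuron(1))    # 160: every control neuron
```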
These virtual neuron clusters are then batched into virtual superclusters, controlling the larger body segment. So, a human shoulder would have one cluster for swinging rotation and one for "flapping" rotation (as in jumping jacks). The supercluster has a set of control neurons, broken into a few classes.
I was thinking about AI, and I came to the conclusion that, well, computer scientists are doing it wrong. This is a problem with computer scientists trying to do something incredibly improbable, and incredibly difficult, in one giant step.
Instead, I posit, we need to take the route that evolution took, and evolve our way to an artificial intelligence. Much of our behaviour (and after all, we want to get something basically human) is based on our evolutionary history. We don't think about being happy — we don't go "oh hey, my brain is telling me that for social reasons it's a great idea to smile right now, because I happen to be happy", either. We just do it — that's the essence of instinct, and much of it is vital to the way an AI would need to behave.
Along with my much-delayed series on web design, I'm going to write a brief series on how I believe one could evolve their way to AI, working at a vastly different level than has been tried before. This will focus on five things:
- Directed evolutionary paths — how to drive a model organism to intelligent behaviour
- Modelling parameters and sensory analogs inside a virtual world
- Controlling behaviour in a virtual environment
- Population sizes and sexual vs. highest-fitness reproduction
- CS and engineering implications of this approach
The crux of this whole thing is that, ironically, we'd need to bastardize evolution a shade to pull this off, as we'd play the dubious role of a "guided intelligent designer". Since the model computer world would, necessarily, have less than 5 × 10^14 m² of surface area, and run what I term "sequence events" at slower than real time, without the vast variation of the Earth's surface, we can't hope to ever run a full simulation of 350 MY or so.
Instead, we would run small sequence events that build on each other, eliciting certain classes of behaviours in medium populations (say, 50 individuals). After their rate of change has stabilized for, say, 1,000 generations, we modify the conditions of their environment and pick the top 50% of individuals to expand their behavioural repertoire. Every 100 generations or so, a population snapshot can be made for research, fallback, and archival purposes. Only the top 50% of individuals (of each sex) are allowed to breed — randomly, with each other, at exactly replacement rate. The only things modified by the programmer (after initial set-up) are the environment, a "fitness bonus" for certain goals, and a "fitness function" that assigns a fitness value to each individual at a given point on the evolutionary trajectory.
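The selection loop described above reduces to a fairly small sketch. Everything here is a stand-in of my own (the "genome" is a bare float and the breed rule is arbitrary); only the scheme itself (top 50% breed at replacement rate, snapshots every 100 generations) comes from the text:

```python
import random

# Minimal sketch of one "sequence event": rank by fitness, let the top
# half breed randomly at exact replacement rate, snapshot periodically.
def run_sequence_event(population, fitness, generations, snapshot_every=100):
    snapshots = []
    for gen in range(generations):
        ranked = sorted(population, key=fitness, reverse=True)
        parents = ranked[: len(ranked) // 2]         # top 50% may breed
        population = [
            breed(random.choice(parents), random.choice(parents))
            for _ in range(len(population))          # replacement rate
        ]
        if gen % snapshot_every == 0:
            snapshots.append(list(population))       # research/fallback copy
    return population, snapshots

def breed(a, b):
    # Placeholder genome: average the parents and add a little noise.
    return (a + b) / 2 + random.gauss(0, 0.1)

pop, snaps = run_sequence_event([random.random() for _ in range(50)],
                                fitness=lambda x: x, generations=200)
print(len(pop), len(snaps))  # 50 2
```

The interesting design question is all in the fitness function and the environment changes between events, which is exactly where the "guided designer" sits.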
I propose an evolutionary trajectory as follows.
A surprising amount of this weekend was dedicated to working on phylogenies. I updated non-eusuchian crocodylomorphs and Avialae through Neoaves to at least extant orders, made minor updates to Sauropoda, and updated Sauropterygia.
The site also finally got a long-needed search engine. It's not the most efficient thing in the world — a binary search is essentially useless on unsorted data — but I made it modular, so I may still end up seeing how a sort + binary search with preserved keys turns out, or cut out the cruft (change the amount of each document searched).
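The "sort + binary search with preserved keys" idea is roughly this (a sketch of my own, not the site's actual PHP engine): sort the terms once, keep each term's original document key alongside it, and bisect instead of scanning:

```python
import bisect

# Build a sorted (term, key) index once, so lookups can use binary search
# while each hit still points back to its original document key.
def build_index(docs):
    """docs: dict of key -> text. Returns a sorted list of (term, key)."""
    return sorted(
        (word.lower(), key)
        for key, text in docs.items()
        for word in text.split()
    )

def search(index, term):
    term = term.lower()
    i = bisect.bisect_left(index, (term, ""))  # jump straight to the term
    keys = []
    while i < len(index) and index[i][0] == term:
        keys.append(index[i][1])               # preserved key -> document
        i += 1
    return keys

idx = build_index({"post1": "crocodylomorph phylogeny",
                   "post2": "sauropod phylogeny update"})
print(search(idx, "phylogeny"))  # ['post1', 'post2']
```

The trade-off is the up-front sort (or index maintenance on every edit) against O(log n) lookups, which is why it's worth keeping modular and measuring.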
I finally also added a dirty implementation of tagging. I used the <tt> (teletype) tag and changed its contents to not display (CSS display:none). Thus, tags for an entry can be hidden in this element, and will be read by the search engine, but not displayed.
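Reading those hidden tags back out takes only a few lines. This is a regex-based sketch of my own (fine for markup you generate yourself, though a real parser is safer for arbitrary HTML):

```python
import re

# Sketch: pull the comma-separated tags out of hidden <tt> elements,
# the way a simple search engine could index them.
def extract_tags(html):
    return [
        tag.strip()
        for block in re.findall(r"<tt>(.*?)</tt>", html, re.S)
        for tag in block.split(",")
    ]

entry = "<h3>Phylogeny updates</h3><tt>dinosaurs, phylogeny, website</tt>"
print(extract_tags(entry))  # ['dinosaurs', 'phylogeny', 'website']
```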
I also noticed that it was hard to find some of the sources I cited with just a name and year — so I've started adding linkouts, DOIs, or ISBNs to all sources provided, to make it easier in the future.
It looks like less in writing than it felt like ... but I'm happy with the way it's going.
Since I've wasted most of today on this (oh, feature creep. And an extra "-1".), I thought I'd share. Reading around on Pharyngula, I found myself wanting to make a more customizable, even-less-preprogrammed version of Richard Dawkins's WEASEL program. So, I present you with a Python 3.0+ version of Dawkins's "Methinks it is a weasel" program. It takes any nonzero-length starting string and has a maximum generational cap for local maxima, with customizable rates for single-bit errors (like SNPs) and duplication errors, the number of offspring, weighting for approaching the target sequence, accepted variability, and whether bad mutations are penalized or not.
The point of the program, as published in The Blind Watchmaker, is to show that with random mutations over time you can expect to see something like order pop out of a random test string. The version I have below takes the genetic modelling a bit further, starting with any string, regardless of length, and duplication / omission errors will trim down to the right solution. In debug mode, there's an additional completely random selection every 500 generations to help pull the program out of local maxima, but it is not in the primary program.
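For readers who just want the core idea before diving into the full source, here is a heavily stripped-down sketch of the weasel scheme — not my actual program, just the kernel of it, with arbitrary parameter choices (5% mutation rate, 100 offspring) and a fixed-length string:

```python
import random
import string

# Core weasel idea: each generation, spawn mutated copies of the current
# string and keep the copy closest to the target. Cumulative selection
# finds the target far faster than blind chance ever could.
TARGET = "METHINKS IT IS LIKE A WEASEL"
ALPHABET = string.ascii_uppercase + " "

def score(s):
    return sum(a == b for a, b in zip(s, TARGET))

def mutate(s, rate=0.05):
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in s)

def weasel(offspring=100, max_generations=1000):
    current = "".join(random.choice(ALPHABET) for _ in TARGET)
    for gen in range(max_generations):
        current = max((mutate(current) for _ in range(offspring)), key=score)
        if current == TARGET:
            return gen, current
    return max_generations, current

generations, result = weasel()
print(generations, result)
```

The full program below the fold adds the variable-length genomes, duplication/omission errors, and penalty options on top of this skeleton.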
Click the fold to read the source, or download it here.
There's a degree of subtlety in webwork that is often missed, but is particularly noticeable when you spend a lot of time designing sites, be it for yourself or for clients. I think I'll break this down into a few miniposts to share my ideas on the topic, give a bit of threading through a few posts, and, well, hopefully bring some ideas out. Right now, my plan is to look at:
- Code validity
- Aesthetics vs. Usability
- Simple engines and security
- Website modularity
- Front-end website interfaces
Which brings us to this first post, on code validity. Now, that is a bit of an obscure choice to start with. Really, who cares? Well, the number of sites lacking a doctype declaration is fairly amazing. People still mix display markup with content markup. So what's with all this, well, kvetching about web standards?
Let's start with the doctype declaration. For example, this blog uses (for now, anyway, and valid as of this posting) the XHTML 1.0 Transitional doctype:
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
This tells the browser to use this style of interpretation when rendering the markup. In particular, the browser should handle certain not-strictly-OK bits of code, such as <input> tags existing outside of block level elements (that is, elements that occupy a chunk of a webpage instead of being able to exist in-line).
This touches on another aspect of valid code. Different types of elements have different nesting requirements. For example, the tag used to write this paragraph, <p>, is by default a block level element. It occupies a block of a webpage, and cannot have other block-level elements inserted into it, like a second paragraph element, or a <div> element (as a general rule; some elements are special, notably <div> and <table>). A block level element creates a new line when formed.
By contrast, inline elements, such as the anchor tag (<a>, most commonly used for links), cannot exist outside of block-level elements. This separation of block-level and inline elements simplifies layout design and makes websites render more faithfully across browsers; when it isn't respected, browsers need to make a non-standardized choice about how to separate these items. Do they form a new line or not? Are the spacings different? Does it implicitly form another block-level element until closed? And so on.
The next item is a bit of a change from the old style of coding back in the 90's and early 2000's. It used to be that you could see bits of code like this:
<p><font color='green' type='comic sans ms'>Whee green!</font></p>
Now, you would see one of these:
<p class='greenpara'>Whee green!</p>
<p id='greenpara'>Whee green!</p>
<p style='color:green;font-family:comic sans ms,segoe print,pristina;'>Whee green!</p>
Which would all render the same green paragraph.
So what is the difference? In the first example, the style information was carried by a separate element, living right in the content markup. In the latter three, the content is just the paragraph element, and the styling information is referenced by a class identifier (which can apply to any number of items with that class), an ID (just one item), or an inline style applied to the entire paragraph element. The styling language is CSS, or Cascading Style Sheets. This means you can actually have multiple styles for a page, as nicely demonstrated by the CSS Zen Garden, which has one page with multiple stylesheets that give it vastly different styles. Less amazingly snazzy ways of seeing the same thing can be seen in the beta site I did (wow, has it been a year?) for the UCMP. Over here, you see the full site in all its CSS glory; over here, you see the site with the external CSS sheet stripped out, and only inline styles remaining; and here, an alternate color scheme that never got off the ground (and is thus incomplete).
Part of the beauty of CSS is that stylesheets can be moved out into external files and then referenced by <link> or <style> elements. This means one stylesheet can be used by multiple pages, and can be updated separately with the change reflected across all pages simultaneously. For example, you can see the layout code for this site here.
If you want to check out the validity of your code, you should check out http://validator.w3.org/ — at the time of this writing, the front page (and all the posts on it) have been marked up correctly and return a valid page.
Till the next entry ...
First, Liz's replacement:
An Eee PC 701, which I've had about a month now, running linux. Itty bitty, but nice and mobile, with a good battery life. It supplements Athena nicely. I've named it Nike, after the Greek goddess of victory.
Now, regarding my previous post on Windows 7, there were a number of issues with my graphics, but I had assumed it was a bad VRAM sector. Turns out, upon opening up the computer, my Koolance GPU cooling block was leaking.
Of course, it was out of warranty (and I hereby absolve ASUS and ATi of blame for the failure). For now, Athena is running open, with the block leaking into a bowl. I suspect large liquid pooling was avoided by the orientation of the cooler and the heat of the GPU. The new ATi 4850 (512 MiB, Sapphire) runs beautifully, and my current system specs give this rating:
There is a weird problem with my system reporting 8 gigs but only addressing 6 — I might have to switch the slots for the most recently added bit of RAM. So, this weekend I'm going to try to drain my system and replace my GPU block with my second CPU block (for my as-yet-unpurchased second CPU). We'll see how it goes. Hopefully no insane hardware failures!
When I get to my desktop, I will replace the huge jpgs with something a bit more reasonable.
So, I'm temporarily down from Berkeley in Glendale, and, as life goes for the computer nerd of the family, it's time to take a look at the computers. As I wait for them to crawl their way through the next update or remove the next piece of crapware, I thought I'd post an article on some computing tools and practices for Windows.
So far, I've fixed a video problem (needed ATI's Windows 7 update), audio problem, network problem (both needed reinstalled drivers), Hamachi problem (again, reinstall) ... I really want to like it, and I really like what works.
But, at least for the 64-bit upgrade path, there is something to not like:
Updates pending as things get fixed. In a bizarre way, I hope it's my video card — then at least the problem will fix itself when the card gets replaced ... if it's the 64-bit upgrade path, then HEY MICROSOFT. NEW BUG FOR YOU!
For those of you who run Windows 7 and want to send out a bug report, enter the following:
> rundll32.exe FeedbackTool.dll,ShowWizard
Now, can I fall asleep at 2am? That's the burning question.
UPDATE 1: Apparently, according to the TechNet forums, this is a known (as yet unresolved) bug. Here's hoping they figure it out sooner rather than later.
For those readers who might want to test-run Windows 7 Release Candidate, I've uploaded a 7-zip package to http://filehost.revealedsingularity.net/Win7RC_Prep.7z that has:
- A *.wmv video of how to install Windows 7 from a flash drive (via MS Taiwan, if I recall)
- A text file with a link to the Windows 7 RC download
- db.xml, a file to go with Microsoft's File Checksum Integrity Verifier utility. It contains the MD5 and SHA-1 hashes of the 64-bit RC.
- The FCIV utility
You can download the release candidate over at Microsoft's site. I'd strongly recommend using a download manager, or a browser with one built in, such as Opera.
Once downloaded, you can compute your own hashes (MD5 and SHA-1) into a new DB file, then compare them against the ones in the db.xml I provide. To do so, type at your command line:
> fciv -xml db_new.xml -sha1 7100.0.090421-1700_x64fre_client_en-us_retail_ultimate-grc1culxfrer_en_dvd.iso
> fciv -list -sha1 -xml db.xml
> fciv -list -sha1 -xml db_new.xml
If the two SHA-1 hashes match, your download is uncorrupted. Have fun! I plan on installing it myself in the next week or so. If someone bothers to hash the 32-bit version, let me know and I'll update this post with those hashes.
- SHA1: FC867FE1AB2E0A9796F9E4D155B44EA6998F4874
- MD5: 98341AF35655137966E382C4FEAA282D
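If you'd rather skip FCIV entirely, a few lines of standard-library Python can produce the same hashes (the function name and example filename are mine, not FCIV's):

```python
import hashlib

# Standard-library alternative to FCIV: compute MD5 and SHA-1 of a file
# in chunks, so a multi-gigabyte ISO never has to fit in memory.
def file_hashes(path, chunk_size=1 << 20):
    md5, sha1 = hashlib.md5(), hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            md5.update(chunk)
            sha1.update(chunk)
    return md5.hexdigest().upper(), sha1.hexdigest().upper()

# Compare against the hashes listed above, e.g.:
# md5, sha1 = file_hashes("windows7rc_x64.iso")  # your downloaded ISO
# print(sha1 == "FC867FE1AB2E0A9796F9E4D155B44EA6998F4874")
```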
And just so it's not entirely worthless for everyone that is not interested, here's a background from Windows 7 Beta 1 (click for full size):