I just heard that Oath has released the Vespa search engine as open source. When I was at Yahoo! it was used for everything. I predict that within a few months a company will exist selling it as a service.
I think Vespa was one of the best-written pieces of infrastructure at Yahoo! that I worked on. It was well documented for an internal app, and it was blazingly fast. The guys working on it were also super smart. It will be interesting to see where it goes, but I think it will be a contender in the search space.
The following uses openssl to generate a random binary sequence “N” bytes long, encoded in hex or base64. The “N” is all-important. Base64 gives you a mix of upper- and lowercase characters plus digits, which some sites require. Hex makes for a more readable string, but it will be longer for the same number of bytes. If the people cracking your password are using an offline database, longer random strings are better (this assumes they don’t know how you encoded your random bytes).
openssl rand -base64 32
openssl rand -hex 32
How long should “N” actually be? This table at Wikipedia lists the entropy in bits per character for various character sets. Hex has 16 symbols, so each character carries 4 bits; 80 bits of entropy therefore needs 20 hex characters, which is 10 random bytes (each byte prints as two hex characters). Base64 is not listed, but its 64 symbols carry 6 bits per character, similar to the case-sensitive-with-numbers set (62 symbols), so 80 bits needs about 14 base64 characters, again roughly 10 bytes of input. The number of bits of entropy you choose is up to you.
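That arithmetic can be checked with a short sketch; the 80-bit target is just the example above, and Python’s stdlib secrets module stands in here for openssl rand:

```python
import math
import secrets

TARGET_BITS = 80  # the example target from the text

# Entropy per printed character, assuming the attacker brute-forces the
# string character by character: hex has 16 symbols (4 bits/char),
# base64 has 64 symbols (6 bits/char).
hex_chars = math.ceil(TARGET_BITS / 4)
b64_chars = math.ceil(TARGET_BITS / 6)
print(hex_chars, b64_chars)  # prints: 20 14

# Each raw random byte carries 8 bits, so 10 bytes are enough either way;
# token_hex(10) is roughly `openssl rand -hex 10`.
password = secrets.token_hex(10)
print(len(password))  # prints: 20
```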
A question came up on cofounderslab asking how to deal with pages that change content based on geolocation, in particular how Google indexes that content.
The problem with serving geo-specific content at the same URLs is that it breaks URL semantics. The “R” in URL means Resource, and in the case given the city is the resource: when I searched, the site served city-specific content. If the city changes based on the user’s location, the resource is changing, and that breaks the semantics. There are two ways to deal with this: subdirectories and subdomains.
One option is to redirect the user to a city-specific subdirectory.
You can hit some gotchas (two cities with the same name, for instance), but this way Google’s index sees each URL as unique content, which it is. Yelp is an example of subdirectories: if I visit their main domain I get redirected to a city-specific path.
Craigslist uses subdomains: if I go to their main website I get redirected to https://sfbay.craigslist.org/
The thing to remember is that a web resource should remain fixed at a single URL. Each method has its own ramifications, but as you can see from Yelp and Craigslist, both work well.
I’ve used both methods. The subdomain route is a bit more work because it involves DNS and needs some way to manage it that isn’t manual, but it’s easily doable, e.g. attaching DNS to a database.
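A minimal sketch of the subdirectory approach, assuming a hypothetical example.com and a naive slug rule (this is not what Yelp actually does):

```python
def canonical_url(city: str, base: str = "https://example.com") -> str:
    """Map a geolocated city to one fixed URL, then redirect the user there.

    The slug rule here is a placeholder; cities with the same name would
    need a region suffix to keep each resource at a unique URL.
    """
    slug = city.lower().replace(" ", "-")
    return f"{base}/{slug}"

print(canonical_url("San Francisco"))  # prints: https://example.com/san-francisco
```

Whether the city ends up in a path or a subdomain, the point is the same: the redirect happens once, and from then on each city’s content lives at its own stable URL.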
The following is where I’m at with Jenny’s portrait. I started off with a rough drawing, then painted the grisaille, i.e. mostly black and white, until the main drawing and tones were complete. It looks gray because the grisaille is still very much showing. This painting requires a lot more work. A good example of what needs doing is the eyebrow: it looks painted on, which it literally has been, but it should not look that way 🙂
The composition came from Rembrandt. He had a tendency to light his subjects very strongly from one side or from above. With this portrait I wanted something similar, but I also liked the fact that it looks like Jenny is emerging from the dark. I’m not sure how it’s going to turn out yet, but it’s getting there.