The Atlantic, Cities: “People would live in small city clusters built around a town center replete with stores, offices, schools, public buildings, and parks. Traveling around town, residents would take the “light” road network. They would walk, bike, or drive tiny cars incapable of exceeding 25 mph. There would be no on-street parking at all. The general idea is to promote interaction and accessibility.”
I really love this idea. We live in a neighborhood where our homes are tightly packed, and I like it. From the outside it looks like a really nice apartment complex. We have two community pools, tennis courts, and a basketball court. After we'd lived here a couple of months I mentioned to Kim how I wished we had a little convenience store in the neighborhood, something simple, just the basics: a meat counter, fruits and veggies, dairy, and some very basic household items. I remember back to my childhood in Lindsay, California. We had a little corner store in our neighborhood, Linwood Market, that was exactly what I described. When we'd play baseball in the street on a hot summer day, the group of boys I hung out with would often pool our money, look for bottles to redeem, and head to the market to buy a six pack of root beer or sarsaparilla to quench our thirst. Yes, very reminiscent of The Sandlot.
I think society could use more neighborly neighborhoods. Having a small, self-sustaining community could help, I think. If something like this sprang up in the valley I'd be very interested in checking it out.
Craig Hockenberry: “Work was proceeding at a very fast pace during the first week of January 2007. Beta releases were frequent and widely distributed. Fortunately, the folks at Twitter were using our app with its snazzy new bird icon. One of our beta testers was an API engineer named Blaine Cook who sent me the following email:”
The word “tweet” was suggested by a Twitter engineer, but not used by Twitter itself for a very long time. The word was first used in Twitterrific. What a groundbreaking Twitter client. So much so that people still think Ollie is the official Twitter logo. Whoops.
Guy English: “There's only one CPU socket and it bets heavily on the bus and GPU performance. While this looks to software to be just another Mac, it isn't. Its capabilities aren't traditional. The CPU is a front end to a couple of very capable massively parallel processors at the end of a relatively fast bus. One of those GPUs isn't even hooked up to do graphics. I think that's a serious tell. If you leverage your massively parallel GPU to run a computation that runs even one second and in that time you can't update your screen, that's a problem. Have one GPU dedicated to rendering and a second available for serious computation and you've got an architecture that'll feel incredible to work with.”
At my day job I work on an SDK that allows people to embed video in their applications. The SDK lives on an awesome framework developed by our Systems team that is portable and allows us to create plugins that can process media and push it down the pipeline. That pipeline includes plugins to receive data from the network, decode that data, time it, and render it to a portion of a display. It can do this for live and recorded video, MPEG4, H.264, and even low frame rate JPEG video (so we don't have to decode frames on the client). But, I digress. If you notice, I mentioned decoding. We've looked at decoding with hardware, but it's actually quite expensive: you push encoded frames across the bus, decode them, push them back across the bus, and finally render them, which pushes them across the bus yet again. Ick.
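The plugin pipeline described above can be sketched in miniature. This is a toy model, not the real SDK: every class and method name here is hypothetical, and the actual framework (which is C++ and portable) is not public. It just shows the shape of the idea: each stage transforms a frame and hands it to the next stage downstream.

```python
# Toy sketch of a plugin-style media pipeline. All names are hypothetical;
# the real SDK's framework is not public. Each plugin processes a frame
# and pushes the result to the next stage.

class Plugin:
    """A pipeline stage: transform a frame, hand it downstream."""
    def __init__(self):
        self.next = None

    def push(self, frame):
        frame = self.process(frame)
        if self.next is not None:
            self.next.push(frame)

    def process(self, frame):
        return frame  # default: pass through unchanged

class NetworkSource(Plugin):
    def process(self, frame):
        frame["received"] = True  # stand-in for reading off the wire
        return frame

class Decoder(Plugin):
    def process(self, frame):
        frame["decoded"] = True  # stand-in for e.g. H.264 -> raw pixels
        return frame

class Renderer(Plugin):
    def __init__(self):
        super().__init__()
        self.rendered = []

    def process(self, frame):
        self.rendered.append(frame)  # stand-in for drawing to a display
        return frame

def build_pipeline(*stages):
    """Chain stages in order and return the head of the pipeline."""
    for upstream, downstream in zip(stages, stages[1:]):
        upstream.next = downstream
    return stages[0]

# Wire receive -> decode -> render and push one frame through.
renderer = Renderer()
head = build_pipeline(NetworkSource(), Decoder(), renderer)
head.push({"codec": "H.264"})
```

A timing stage would slot in the same way, between the decoder and the renderer, without either of its neighbors knowing it was there; that composability is the appeal of this kind of architecture.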
At one time Pelco had built its own combo card that could decode video and render to the display with a single push across the bus. That was a cool piece of hardware. At the time we could decode and display sixteen separate video streams simultaneously, at varying frame rates. By today's standards that card was extremely underpowered. I guess what I'm getting at is this: How cool would it be to leverage one GPU on a Mac Pro for decoding all video, be it one stream or sixteen, and push the results across to the secondary GPU for rendering, without a transfer back across the bus to main memory? The idea of it seems very exciting.
Now all we need to do is build our pipeline for Mac OS X (totally doable) and create a new decode/render plugin that takes advantage of the new GPU. I'm not sure if it's totally possible, without multiple bus transfers, but it would be fun to try.
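The transfer-count argument above can be made concrete with a toy tally of bus crossings per frame. The exact paths are assumptions for illustration, not measurements: a discrete hardware decoder forces a round trip through main memory, while decoding and rendering on the GPU side of the bus (or handing frames GPU-to-GPU on a dual-GPU Mac Pro) would only ever move the small encoded frame across.

```python
# Toy tally of main-bus crossings per frame for two decode strategies.
# The paths are illustrative assumptions, not measurements of real hardware.

TRANSFER_PATHS = {
    # Discrete decoder card: encoded frame out, decoded frame back,
    # decoded frame out again to the display adapter.
    "hw_decoder_round_trip": [
        "encoded frame: RAM -> decoder card",
        "decoded frame: decoder card -> RAM",
        "decoded frame: RAM -> display GPU",
    ],
    # Decode and render on the GPU side of the bus: only the (small)
    # encoded frame ever crosses; decoded pixels stay in GPU memory.
    "decode_and_render_on_gpu": [
        "encoded frame: RAM -> GPU",
    ],
}

def bus_crossings(path):
    """Number of main-bus transfers per frame for the given strategy."""
    return len(TRANSFER_PATHS[path])

savings = bus_crossings("hw_decoder_round_trip") - bus_crossings("decode_and_render_on_gpu")
```

Decoded frames are also much larger than encoded ones, so the crossings avoided are the expensive ones; that asymmetry is why the round trip felt so icky in practice.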
John Gruber: “This made my day, but allow me to put my humble hat on for a moment, and praise two apps that foreshadowed iOS 7 long before Vesper.”
Calling attention to two iOS UI pioneers was the right thing to do. It also shows that Mr. Gruber is a class act.
Thank you, John.
Ben Thompson: “My use of Paper is an essential part of stratechery, yet I needed to only pay $8.99 for two in-app purchases, for which I never need to pay again. That's a hell of a bargain, but it's ultimately unsustainable.”
It’s difficult to sustain a business on an app when you can only charge for the initial purchase. This is a problem Apple needs to solve on behalf of app developers.
Nick Bradbury: “My thanks to everyone who helped me keep FeedDemon going for so long – when I created it in 2003, I don't think I would've believed it would still be around 10 years later! It's been truly fun working on it, and I'm sad to see it go.”
It's hard to believe it's been 10 years. Thanks for the great software, Nick.
Instagram Blog: “We've been able to do this with an incredibly small team — today just 35 people — who live and breathe Instagram, and we're looking for folks who are just as passionate to join us.”
Instagram has come a long way since they flipped the switch on their little photo sharing service in October 2010.
On Wednesday, October 6, 2010, Instagram launched its mobile photo sharing service for iPhone. In six hours, the back-end operation, which was running off a single machine in Los Angeles, was completely overwhelmed. – Mashable