Technology / Phone / Messaging
November 30, 2011
Who Stewards the Personal Data Question? Org Chart
Below is a diagram showing the non-profit organizations (note: no for-profits, conferences, or governmental orgs were included) that are stewarding pieces of the Personal Data Ecosystem. I wanted to show how these orgs relate to the problem of remaking our digital lives through more user-driven personal data, so that our transactions with companies, the online world, and our government become more equal.
The orgs have been divided into four areas: technical, market, policy and individual advocates. While all the orgs have an interest, and are doing some thinking, in all the areas, these divisions show the foundational mission of each org. If each org succeeded at its foundational mission, they would be heroes for sure. The problem is mission creep. This is a problem for startups as well: companies that don't focus on getting their piece right, but instead think competitively and try to take too many pieces of the market, tend to fail. So too could the sheer number of problems, plus mission creep, cause any of these orgs to fail at its mission.
Ideally, we'll see all the orgs working together in interdisciplinary and multidisciplinary ways, relating each of their solutions to the others, while staying focused and executing their piece of this vast and Byzantine puzzle that is the Personal Data Ecosystem. In creating this "org chart" I talked with folks like Kevin Marks of Microformats and Activity Streams, Harry Halpin of the Federated Social Web, Scott David, Don Thibeau of OIX and OpenID, Drummond Reed (who has worked with OASIS extensively), Doc Searls of VRM, Craig Burton, Steve Repetti and Phil Wolff of the Data Portability Project, Dazza Greenwood of ID Cubed, and Judi Clark and Joe Andrieu of the Information Sharing Working Group, among others.
So here is a picture of who is doing what in the Personal Data space:
Below is more information on these organizations.
Customer Commons -- recently formed by Doc and Joyce Searls, Renee Lloyd, Joe Andrieu, Dean Landsman, Markus Sabadello, Judi Clark, Iain Henderson, Craig Burton, and me, as well as a few others in the room that, I apologize, I'm forgetting. Customer Commons' mission is: a community of customers, funded only by customers, serving the interests and aspirations of customers.
Personal Data Ecosystem Consortium -- is a trade association for startups and big companies that agree to a set of principles for user-driven personal data. 19 companies (currently) have joined, and PDEC's mission is to support market solutions to the personal data question. Kaliya Hamlin is Executive Director and I am Chair of the Board.
PDEC also has just formed a Legal Town Hall, a monthly call starting January 11, 2012, to be led by Judi Clark, to talk about what kind of policies are needed when individuals share their data.
World Economic Forum -- WEF has been working with lots of early thinkers in the Personal Data space for the past 18 months to "rethink personal data." They put out a report: Personal Data: a New Asset Class last February and continue to have monthly calls to prepare for a presentation of the working groups' efforts at Davos in January.
Project VRM -- Vendor Relationship Management, the brainchild of Doc Searls created during his fellowship at the Berkman Center, is a discussion group with a very active maillist, a movement for user-driven relationships with entities, and a steward of developers coding to bear out the group's vision.
OIX: Open Identity Exchange -- Don Thibeau is Chair of their Board, and Scott David is their counsel. OIX's mission is to build trust in the exchange of identity credentials online. They do this through the open standardization of trust frameworks. They don't make trust frameworks; rather, their mission is to be the home of others' trust frameworks for the sharing of personal data, login credentials, and other types of private or controlled information. For example, the company Drummond Reed co-founded listed its Respect Trust Framework at OIX, which publishes it for others to point to as a public declaration of the trust framework. And the U.S. FICAM Trust Framework was the first open identity trust framework to be listed by OIX.
Information Sharing Working Group -- From the ISWG: The ISWG works with the Kantara Initiative, Identity Commons, Project VRM, the Personal Data Ecosystem Consortium, and Customer Commons. Run by co-chairs, Joe Andrieu and Iain Henderson and secretary Judi Clark, ISWG's formal mission is "to identify and document the use cases and scenarios that illustrate the various sub-sets of user driven information, the benefits therein, and specify the policy and technology enablers that should be put in place to enable this information to flow."
The Information Sharing Working Group helps individuals take control of the information we share online. The Standard Information Sharing Agreement is a contract for the use of your information, agreed to BEFORE you share it. It has two parts. A basic agreement covers all the default terms, things like "don't redistribute my information without my permission", which all recipients agree to. Then, for each individual instance of sharing, there is a data transaction agreement with just the bare essentials: who gets what data for what purpose. By moving all the complicated legalese into the basic agreement, we've dramatically simplified each specific transaction agreement.
Now, when you want to know what’s happening with your data, it’s presented simply and concisely in easy-to-understand terms… while the basic agreement defines how recipients must treat your data appropriately. The Sharing Agreement is designed to make it easy to understand and make informed decisions about sharing information online.
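As a rough sketch of that two-part structure, it might look something like the following -- note these class names, field names, and terms are my own invention for illustration, not the ISWG's actual agreement text:

```python
from dataclasses import dataclass, field

@dataclass
class BasicAgreement:
    # Default terms every recipient accepts once, up front.
    # (Example terms only; the real agreement defines the real ones.)
    terms: list = field(default_factory=lambda: [
        "Don't redistribute my information without my permission",
        "Delete my information when the stated purpose is fulfilled",
    ])

@dataclass
class DataTransactionAgreement:
    # The bare essentials for one specific act of sharing.
    recipient: str  # who gets the data
    data: list      # what data they get
    purpose: str    # what they may use it for

# One specific, easy-to-read transaction on top of the standing basic terms.
txn = DataTransactionAgreement(
    recipient="example-retailer.com",
    data=["name", "shipping address"],
    purpose="ship one order",
)
print(txn.recipient)
```

The point of the split is visible in the sketch: the legalese lives in one place you agree to once, while each transaction record stays small enough to read at a glance.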
ID Cubed (ID3) -- a newly formed research and development group affiliated with MIT and led by John Clippinger, Executive Director and CEO (who started the Law Lab at Berkman/Harvard a couple of years ago, and the Social Physics project a couple of years before that, also at Berkman), and Henrik Sandell, COO and CTO of ID3. ID3's mission is to "oversee the development of a multi-disciplinary center founded to research the role of law in facilitating cooperation and entrepreneurial innovation." Their major focus, based upon the website, seems to be trust framework development. Dazza Greenwood is also involved, and Mike Schwartz of Gluu is doing some technical work for them.
Data Portability Project -- "Aims to consult, design, educate and advocate interoperable data portability to users, developers and vendors." They don't make standards but they help steward them to support more data portability, including protocols like OpenID, OAuth, RSS, Microformats and RDF among others. Steve Repetti is their Chair and Phil Wolff is very active as a public speaker for them. Here is some additional information about their mission.
Federated Social Web -- has recently become a working group of W3C, and is stewarded by many including Evan Prodromou and Harry Halpin. FSW is stewarding work on federated social web software and protocols, including things like PubSubHubbub, OpenID, Activity Streams, and OAuth, among many other protocols.
Activity Streams -- developed a protocol for how users share personal data, using both JSON and Atom based streams of metadata. Monica Wilkinson and Kevin Marks actively steward the project. Activity Streams works on the Microformats model, proposing standards around activities already heavily in use online.
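To give a sense of the format, here is a minimal activity in the spirit of the Activity Streams 1.0 JSON serialization: an actor, a verb, and an object. The core field names (actor, verb, object, published, objectType, displayName) come from the spec; the specific IDs and values are invented for illustration:

```python
import json

# An invented "Jane posts a note" activity in Activity Streams 1.0 style.
activity = {
    "published": "2011-11-30T12:00:00Z",
    "actor": {
        "objectType": "person",
        "id": "tag:example.org,2011:jane",
        "displayName": "Jane Doe",
    },
    "verb": "post",
    "object": {
        "objectType": "note",
        "id": "tag:example.org,2011:note-1",
        "content": "Hello, federated social web!",
    },
}

# Serialize it the way a stream publisher would.
print(json.dumps(activity, indent=2))
```

Because activities are just "actor verb object" metadata, any site can publish or consume a stream of them without knowing anything else about the other site, which is what makes the model attractive for federation.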
Microformats -- Microformats have been created for many pieces of data shared, such as hcard or hcalendar. Stewards of this project include Tantek Celik and Kevin Marks.
OpenID -- Created the protocol for federated login with the OpenID 2.0 spec. The OpenID Foundation is currently working with Microsoft, Google and Facebook on OpenID Connect, as well as on Account Chooser, an open standard that eases web sign-in and switching between multiple accounts on a website. The OpenID Foundation's chair is Don Thibeau.
ID Trust, OASIS -- from their website: "...promotes greater understanding and adoption of standards-based identity and trusted infrastructure technologies, policies, and practices. The group provides a neutral setting where government agencies, companies, research institutes, and individuals work together to advance the use of trusted infrastructures, including the Public Key Infrastructure (PKI)."
XDI.org -- responsible for the XRI / XDI standard, currently for pointing to data and creating link contracts. From their website: "XDI.ORG is an international non-profit public trust organization governing open public XRI and XDI infrastructure. XRI (Extensible Resource Identifier) and XDI (XRI Data Interchange) are open standards for digital identity addressing and trusted data sharing developed at OASIS, the leading XML e-business standards body. XRI and XDI infrastructure enables individuals and organizations to establish persistent, privacy-protected Internet identities and form long-term, trusted peer-to-peer data sharing relationships." Drummond Reed co-chaired the XRI TC at OASIS with Gabe Wachob, and Andy Dale, Markus Sabadello, and Mike Schwartz were involved in developing the standard.
W3C -- Umbrella standards body stewarding a number of standards for personal data use and control including the Do Not Track proposal. The Federated Social Web, and all their combined efforts including Activity Streams, recently landed at W3C.
ITU (International Telecommunication Union) -- making infocommunications standards since 1865. Yes... that's really 1865.
User Managed Access (UMA), a Kantara working group -- develops specs to allow individuals to "control the authorization of data sharing and service access made between online services on the individual's behalf, and to facilitate interoperable implementations of the specs." UMA group chair is Eve Maler.
The Direct Project -- From their website: "The Direct Project specifies a simple, secure, scalable, standards-based way for participants to send authenticated, encrypted health information directly to known, trusted recipients over the Internet."
IETF (Internet Engineering Task Force) -- Working on a number of standards around identity and data portability.
Claims Agent Working Group -- is working on the development of standards-based, interoperable, verified claims agent implementations. It is hosted at Identity Commons and was originally proposed by Paul Trevithick, though many people are part of the group.
Open Web Foundation -- is an "independent non-profit dedicated to the development and protection of open, non-proprietary specifications for web technologies" and uses an open source model similar to the Apache Foundation's. Their leadership includes Tantek Celik, Chris Messina & David Recordon.
Update: I've added the following item to technical:
SWIFT -- a non-profit based in Brussels that provides messaging standards around banking wires, is proposing a new infrastructure layer called the "Digital Asset Grid." The DAG would provide the metadata for all data transactions (including personal data), not just money wires, as well as a hardened, full duplex transaction layer for security, flexible identity and certified data. (Full disclosure, I'm on the team that proposed the Digital Asset Grid to SWIFT).
If you have more information about these groups, people involved, or corrections, please leave them in the comments and I'll update the post. Thanks!
March 19, 2009
The Life of a Tweet
Twitter (and the iSchool -- or one of my poor brethren -- I have a masters from UC Berkeley's iSchool) seems to be abuzz in the tweetsphere over one ill-considered tweet tossed off by a student and found by her summer internship employer, likely via search.twitter.com. For background, you can see this: FattyCisco.com. The poor girl is likely humiliated and horrified over what she thought was an innocent and likely fleeting thought that didn't really reflect how she felt overall.
We've all had those momentary thoughts where, when we are ambivalent, we toss something out of our mouths and, once it's out there, we think: wow, that doesn't even ring true; or, it did for a nanosecond, and now it's changed; or, gee, that's about 5% of the way I actually feel about this. But something spoken aloud, truly ephemeral (unless recorded in some form), is different from something written down and searchable in the grand database of the Googlezon and search at Twitter. Or maybe it's just a joke.
This is one of the problems with online communities and specifically twitter:
You don't know who's listening, and because of search tools, you are findable beyond your follower list
or your "community" of known tweeters (ppl you @ with or read) unless your account is private.
I don't think we have at all sussed out what it means to tweet in the long term, or what the power of the tweet is, or where the tweet goes and what sort of life it has beyond the first few minutes or hours of its life in the Twitter / client context.
This is another example of something that happened recently:
A PR exec going to Memphis to meet with a client, FedEx, insulted the client on the way to the meeting. The client wrote a letter to him, his bosses, and the PR company, and cc'd everyone at FedEx as well. Ooops.
The problem is, tweets go to those paying attention at the moment, those who may save tweets in clients (I leave my Twitter client open and check it now and then as I have time -- right now I have 15k tweets from the past couple of days), those pivoting on a single user, those searching for key words, and those looking at related conversations.
But when you tweet, in your head you're often just thinking about those you expect to read it, like only the few of your followers paying attention at the time. What happens with some tweets (some reading by some followers) is not what can happen with all tweets.
The interface and interaction at Twitter's website doesn't lead you to believe that what happens most often there will happen in incendiary examples. And different Twitter clients (an Android or iPhone app, for example) don't lead you to understand, through use, the permanent nature of tweets the way that, say, search.twitter.com might, as you see something you deleted appear there anyway.
It takes experience with all these different modalities to inform you because there is no advance disclosure or warning of the elasticity of a single tweet.
What is most interesting is this pushes me to think harder about what the interface of "aged information" online looks like (and I don't mean google search results that move from page 1 to page 3 over time).
And I have to ask myself what it would mean to have what Judith Donath discussed on the panel, Is Privacy Dead or Just Very Confused, moderated Saturday at SXSW by danah boyd. Judith discussed having some kind of a "mirror" for you of your digital self that would reflect all your online presentation and communications and expression... just so you might get a sense of what you show people and what you project at a moment in time. Right now it's really hard to gather that sense of yourself. Right now, you don't really see it in any sort of complete way. But others see pieces of you digitally represented at different times. It would be like re-disclosing for yourself what you've done, discovering how others view you, in slices or on the whole, in order to see the effect you have. It would probably be helpful to know what had reach and where, and what was for now at least, forgotten.
But frankly, the privacy implications of that are huge as well. So, I'm thinking. No answers on that one yet.
January 27, 2009
Hey.. She's Geeky is a few days away, and you can still sign up.
The list of great women attending is here: She's Geeky Attendees and Registration
Really looking forward to interacting with all those awesome girl geeks on Friday and Saturday at the Computer History Museum in Mountain View!
April 17, 2008
FCC Hearing at Stanford Today
I can't go, but I hope lots of folks out there who support an open and free internet do. Here's the schedule according to Save The Internet:
It is rare for all five members of the Federal Communications Commission to leave Washington, D.C., and they want to hear from you. There will be a public comment period - come speak up to save the Internet!
WHAT: Public Hearing on the Future of the Internet
WHEN: Thursday, April 17
TIME: 12:00 p.m. to 7:00 p.m.
WHERE: Dinkelspiel Auditorium, Stanford University
(471 Lagunita Drive, Palo Alto, CA) Map It!
For directions and travel information, visit: http://www.savetheinternet.com/=stanford_travel
FCC Public Hearing Agenda
12:00 p.m. - Welcome/Opening Remarks
12:45 p.m. - Panel 1: Network Management and Consumer Expectations
3:00 p.m. - Panel 2: Consumer Access to Emerging Internet Technologies and Applications
4:30 p.m. - Public Comment
6:30 p.m. - Closing Remarks
7:00 p.m. - Adjournment
Note also that Comcast is proposing a "P2P Bill of Rights and Responsibilities," according to Ars Technica, which is skeptical. I don't see any users in that room, but if they don't invite us, I'd guess that after Boston we'd all get pretty mad and force them to include us. Either way (FCC or voluntary code), I think it's going to be user-centric in the end. We're just going to have to fight like hell.
Kevin Marks also makes a great point about Comcast: they are like The Producers, who oversold their Broadway show, assuming it would fail, by getting 100 people each to buy 10% of the show. Comcast is doing the same by overselling its network for internet access, and then having secret levels above which it cuts people off out of the blue, which is pretty bad.
January 18, 2008
The FAA TRACON Information Experience Live
Earlier today I had the delightful experience of touring the FAA's Northern California TRACON facility.
Basically, TRACON, which stands for Terminal Radar Approach Control, is the air traffic control center which, in this case, handles Northern California. TRACON handles traffic outside of each local control tower a plane might ultimately deal with as it lands. There are TRACONs all over the US for other regions. We weren't allowed to bring in cameras, so I'll instead show you a news photo from SFGate that is representative of what we saw up on the wall of the facility. You get the idea there of what they are seeing on some of their screens.
This photo only shows traffic into SF, because it's a visualization from SFO traffic control, but just imagine more planes going into San Jose, Sacramento, and other smaller airports like Modesto. Also, these screens are synced between TRACON and the air traffic controllers who are local. And if anything happened to one TRACON, others would instantly fill in, as the system works somewhat like the internet in that sense.
TRACON is housed in a big, windowless building, extremely modern and cool, with an air of serious importance about it (I always find that at, say, buildings in Washington DC, and I kind of like it, even if they do take whatever it is they do a bit too seriously sometimes). Our tour guide, a woman who is a trainer for other air traffic controllers, at one point said, "You have 10 seconds or so to make contact with a plane and move on. If you screw up, there are hundreds of lives on the line." That's pretty serious.
TRACON's building is basically an octopus design, where each leg has 20 or so terminals with about 10 people in each, manning a particular physical area (like planes coming into Sacramento) in order to follow planes as they enter the region first. All commercial flights must fly IFR -- Instrument Flight Rules -- which means they have to be in contact with TRACON, in case they can't see or there is bad weather, or there is simply a pile up of planes that need to be moderated into an airport. Planes that fly VFR -- Visual Flight Rules -- don't have to contact TRACON, but some do anyway for a variety of reasons. TRACON has longer range radar than the local air controllers, but the longer range radar updates more slowly. So that is the trade-off between regional (TRACON) and local control.
Once TRACON has the plane logged, they make a little block of data on their screens (a different type of screen than the one shown above) that shows the flight number, its altitude, and other information that will help them keep planes apart, on track and moderated as they reach the range of the local control towers who then take over moderating the planes.
In the cycle of life for a controller (who has to quit at age 56 and cannot start training after age 31), they typically have military training or attend a special school after college, and then are trained at the local site. Our host said that for the first few years (maybe up to 10) controllers are pretty tense on the job, but after 10 years they relax some. She said the most dangerous situations come when people are relaxed, and less is going on around them, rather than more. That's when mistakes are made.
Another thing our host said was that they have to keep the chit chat down, because if there is an accident, they don't want to have some controller chatting away on the transcript just before it happens. They are pretty businesslike when talking to pilots. She talked pretty fast, she said, because of the edgy pace needed to quickly regulate the flow and placement of all the different planes they are watching, and that's how she trains people. I know from riding frequently in a friend's plane, where I can listen to lots of this talk, that they are pretty succinct, and yet both pilots and controllers have a kind of cultural humor that is pretty funny in those few words they exchange, and this often lets some personality come through. If you want to check out what happens, here are some example live sound feeds from a bunch of different air control areas.
So.. what were the information systems like? Well, I thought they were fascinating. The premise in building, training for, and using them is very different from, say, the web-based systems I typically work on in my day-to-day life. In fact, in many ways they had the exact opposite goals and metaphors from the ones I use to build systems and interfaces. First, they train their people on these systems for between 6 months and 5 years -- but our guide said 2-5 years is typical.
Think about that. Training your user for 2 years. What would that mean to interface architecture and design? You could certainly do a lot different with it than what we do now on the web.
Their top menu, interestingly, is literally a series of very-1993 buttons, big squares, in rows, maybe 8 across and 12 down, though all those gorgeous 22 inch screens are touch screens. Each controller has two of them, not horizontally placed, but vertically, in the workspace. Some of those buttons go to pages that help track planes, but I did note one, placed furthest away from the user's sitting position, for that day's cafe menu. It appeared that all possible items were options at the top level. Nothing appeared to be pushed back to a lower level or made less important or secondary in the interface other than two items described below.
When you go into the main menu items, there is little to cue you back, and in fact many of the screens were missing back buttons. Some had them and some didn't. But with that much training before you can even get into a real working station, it doesn't seem to really matter. You know the system inside and out, as well as how and what to do with it and all the planes you have to manage (typically 10 - 20 at one time).
A lot of information is stored in the user's head, and as new plane info comes up, only the abbreviation or shorthand block code describing the plane is on the screen along with various map-based data to place the plane. This means that instead of giving lots of data on one plane on the screen, the data is offloaded to the user and the screen just has the shorthand.
That shorthand for a plane is shown in the middle screen (below the menu in the top screen), which has the map with blocks of data representing planes. Their systems look much like map systems we use online in a way but with way cooler visualizations because they have radar and more info about airspace restrictions and well.. I don't know any web service that has radar. Imagine "Google Radar" overlaid on Google maps? That would be a cool product launch.
So, in other words, the information systems metaphor seemed to be the exact opposite of what we do in web systems: TRACON systems are built with high mental overhead. You have to know a lot to use and understand the systems before you start to navigate, because nothing in those buttons really helps you know what is below, other than the word on top. During actual use, when you enter and track planes, you draw on that overhead from the years of training you did before you could operate the system in play. The information systems below those buttons also have little styling that would take any one piece of information and make it more important than any other on the same screen. Information is chunked or grouped a little on those secondary pages, but that's it. So there is no expectation that anything is pushed back or pushed forward, other than in the menu, where each little button represents a page/function, and each page has that function represented.
Instead of the software deciding what is most important at the moment of use, and emphasizing it in some formated way, the user just has all of it equally represented and therefore has to decide what's necessary or relevant. In some cases, there was a mini system below a secondary page via a link, to find backup documentation on a plane (if the controller asked the plane to do something, and the plane wasn't built for it, they could check the specs on the plane) or on a small airport (to get backup data on landing strips and landing directions). But these seemed to be relatively rare use cases that allowed the backup information to be lower down to a third level.
Our other tour guide, a man who'd checked us in, gave an introductory presentation in PowerPoint to explain the basics, and then finished up at the end. He told a couple of stories, like what happened on 9/11. He said they grounded every plane everywhere, coming and going, anywhere in the country. It was eerie, because all their screens (which we'd been seeing with, depending on scope, somewhere between 20 and thousands of planes) were almost completely empty. Black. With little white map lines showing various air, altitude or other restrictions, and weather. They spent three days watching military jets fly around, and that was it. Nothing else.
My take on this sort of system was that it could stand visual and architectural improvement, but that without a lot of study and planning, it would be dangerous to change it. And, the users are so adept at the system they have now, and have so much responsibility and pressure to perform quickly, that changes would likely be unwelcome. Extensive study of user behavior and needs would have to happen, and then extensive testing would have to follow before anything could be put into practice. I can see why they maintain the same system (it's not from 1993 though.. it's much more recent), and just update it with new air space data and plane info, and don't do much to mess with a working system.
But it was still fascinating to see the TRACON information and understand the motivations for its construction and use. And comparing that to what we do building web systems? The best!
October 28, 2007
Fiber Optics in Sherborn Massachusetts
I'm visiting with some friends in Sherborn, Massachusetts. They previously had dial-up internet access, but sometime in the last two years, everyone (about 3,000 people) in this town, as well as in surrounding towns, got fiber optic lines put in by Verizon.
They have 5 Mbps of downstream service for $35 a month, and if they pay $7 more per month, they can get 15 Mbps. It's rocket fast -- so fast, as my host says, that "it's too fast to take advantage of much besides video and VOIP because no one else has a fast connection to talk that fast with you." But it still rocks.
Everywhere I go in the Bay Area -- work, home, friends' offices, public places -- I wait for every website, video, VOIP connection, etc. that I use. The contrast here is just amazing. And every window I look through in my host's house has gorgeous forest and fall colors... it's at least 100 yards to the next house, and all the houses here have that sort of spread. How do they do it when we can't get this in the denseness of Berkeley, San Francisco, or Mountain View?
I'm sure the telcos that took $200 billion from the FCC and then didn't install fiber optic service have some excuse, but it's BS. They just need to install it since we paid for it, and then we can all move on.
October 24, 2007
James Cicconi of AT&T On Net Neutrality
James Cicconi, Senior Executive VP Legislative and External Affairs for AT&T was at Esme Vos' Muniwireless conference yesterday, spewing what I would kindly call the greatest of spin, and unkindly as BS.
Net Neutrality is not about people telling network providers what to charge for tiered service. That's bull. Net Neutrality says that video packets, no matter where they come from, will get through at the same rates. Same with text or photos or VOIP or anything else. Under Net Neutrality, the network can't distinguish and discriminate because it doesn't like where a packet came from, or because the place the packet came from didn't pay the telcos any money to prioritize it.
To quote muniwireless (emphasis is mine):
It's Day 2 of the Muniwireless Silicon Valley Conference and they have an executive from AT&T talking about municipal wireless networks.
AT&T has not changed its tune. It is still against cities using public funds to compete with private enterprise and believes that communications should be left up to private firms like AT&T.
James Cicconi, Senior Executive VP Legislative and External Affairs for AT&T claims that there is no duopoly and there is enough competition in the market for telecommunications services, so cities should stay out.
What is AT&T's position on net neutrality?
Net neutrality is a challenge for all companies. You spend billions to deploy your assets and net neutrality means someone telling you what you can do with your assets - what you can charge, tiers of service, etc.
"All bits should be treated equal" is a problem for network engineers because one bit is porn another bit is heart surgery, another is email, yet another is voice, another is spam. That everything should be moved equally end to end is ludicrous. It's a more costly way to do things. It's not efficient, according to AT&T.
AT&T cannot build and maintain assets quickly enough to meet the demand. They are spending $19 billion this year. Some of the demand is driven by video. What happens when people start delivering high definition film? They can't build networks fast enough! What's the answer? Effective traffic management.
The antitrust laws can deal with the problems of net neutrality (side note: unfortunately these are not being enforced today). Why should AT&T want to degrade traffic? They will go to someone else (side note again: in a duopoly, you've got Comcast which has been blocking Bittorent traffic).
I don't know about you, but where I live and work, we have two choices: AT&T for DSL or Comcast for cable internet access. They are both mid-band services, not great but better than dialup. And we pay exorbitantly for them compared to other countries.
So of course they want to take their AT&T/Comcast duopoly and spin Net Neutrality as being all about people interfering with their pricing models for tiered service, when it's really all about prioritizing packets. They want to divert attention from the reality, which is that they want to put their own videos through first -- their media, their VOIP, or media and VOIP from people who've paid them off -- instead of letting users have what they want. The telcos want to own the pipes and the content.
It's wrong and we can't let the telcos win on this.
September 06, 2007
Apple Vacation Email Gone Mad
This past weekend I was gone for three days and decided (after past goading by others who said I really should use the vacation email system in my Mac Mail - and they are mac-heads so I figured they knew) to turn it on.
Well. I chose to use it only for my work email, and assumed it would be smart enough to:
1. send a vacation reply only for email arriving AFTER the thing was turned on
2. send only one reply per email address
3. skip mailing-list messages and anything in other folders (nothing filed elsewhere, including spam in the "Junk" box, should be touched)
What a disaster. People got a vacation email sent in reply to every email they'd ever sent me, including mail lists, spam, anything in any other box that I'd dragged over.
Not only did Mac Mail ignore the date, it sent back 5, 10, or however many emails I'd exchanged with someone, replying to messages that in some cases were years old.
I'm so sorry, if you got spammed. And if you didn't, well, you were lucky. All I can say is, what the hell is Apple thinking with this, and what were those people who told me to use it thinking? It's 2007.
This was all so amateurish circa 1997. I really couldn't believe it when I got home and saw what had happened.
I'll never use vacation email again. This is why I avoided it in the past. Because my experience with it from others has not been good. And now I'm one of those vaca spammers who are so annoying. Yuck.
May 22, 2007
US Internet Speeds are Really Slow..
Via Dave Farber's IP list from Press Etc:
Average broadband download speed in the US is 1.9 Mbps. It is 61 Mbps in Japan, 45 Mbps in South Korea, 18 Mbps in Sweden, 17 Mbps in France, and 7 Mbps in Canada.
I've talked about this before.
Americans are falling further and further behind in adopting technologies like high-speed internet access, as well as cell phone tools and services that are far more dynamic in the rest of the world. This is due to terrible public policies around these technologies and to selfish companies who provide the services in monopolistic ways.
Two to four years after I first talked about this, we are further behind than ever. It's appalling, but you can read about the $200 billion scam on the US by Verizon, Qwest and the Bell companies here.
March 16, 2007
Just When I Thought Cingular Couldn't Get More Evil...
So, last month, after my Cingular bill was three days late (3 days!) they sent out a notice to me to pay up or get cut off. Since I've been a customer for years, and the bill was only for my flat rate service (no additional phone charges), I thought this was kind of draconian when I received the notice, 8 days after the bill was due. But I immediately hopped onto my online billpay service, paid the full bill plus $20 and left it at that. This was on a Wednesday. The following Monday, with no messages from Cingular on my phone, my service was cut off. I called them to ask why (of course, the phone was not able to call them.. they forced me to use a landline. Thanks guys, it's not like you're the phone company or anything.)
Anyway, while I was on hold, I looked at the previous month's bank statement, which showed I'd done online billpay on the 3rd, and then the Cingular bill, which showed they'd received the payment on the 5th of the month. When the person came on the line, I asked her about my phone being disconnected, and about the fact that the bill was only a few days overdue. She said they hadn't received my payment, so they could do that. I said, "How is that? I sent online billpay money the day I received your notice. There is no time under this schedule to even get it in to you? But last month you got it in two days. And my bank just told me online, on their side, that you received the current payment." She turned the phone back on, but not before saying that online billpay typically takes a week or more. I pointed out that between me and Cingular, it's taking two days, and asked what sort of games they were playing anyway.
They don't even give you enough time to act after receiving the notice. My notice was dated the 5th, I got it in the mail on the 8th, and my bank says they had the money on the 10th. I think this is all about generating $36 restoration fees from those who are as forgetful as I am, working and traveling so much right now. And it's not like I have some giant balance. I did pay late.. but it's the same amount every month, I always pay, and before this I've been a very regular on-time payer.
So.. I still don't have my latest bill, to see how long the last online bill pay took compared to when I sent it, the 8th of last month. But.. get this. I received another notice this month, dated the day after the bill was due. I've already paid it. But geez. What a bunch of jerks. I mean, charge a late fee or something. Wait two months before you act like you hate your customers.
The other thing was that on my most recent bill, there were $30 in charges for some ridiculous spam text messages. I didn't ask for these messages, and so I called Cingular to have them removed. And I asked that before anyone can "charge me without asking" that Cingular force them to have to ask me first, before texting me twice, for $30. Cingular said they couldn't do that. I said they do it for collect calls, so why not "collect text messages?" They said they could only remove text messaging as a feature, which I need and use all the time. So my choices are, either be open to getting slammed by evil text messaging companies for very high fees, or don't have text messaging. Yikes. Way to protect your customers, Cingular.
Oh.. but wait. Maybe Cingular gets a part of the $30 fee, and therefore doesn't want to shut down this bad behavior?
Then, on Dave Farber's IP list, I found out that Cingular, Qwest, AT&T and Sprint are blocking calls to FreeConference.com. GigaOm also did some investigation that's interesting. That's because they don't want us to use our cellphones to call a long distance number, to get free conference calling.
OMG. I mean really. I pay Cingular approximately $.10 a minute during the "anytime" calling hours, and that's when I use Free Conference. Who are they to block me from dialing, not an 800 number, but a regular charge call? That they make money from?
Cingular == evil. I would do anything to get off the stupid telco rollercoaster that seems to be routed perpetually downward, and just use VOIP with an internet connection on my cell phone.