
Monday, October 27, 2014

It's not JUST about the money - it's your REPUTATION, too!

I'm posting this article from the New York Times because it directly addresses some of my concerns about H.S.I.'s - and the CCoC's - gathering of PPI (Protected Personal Information) about their tenants (whom they refer to as homeless, even though they have leases and pay rent - or have rent paid for them through various subsidies, including Section 8). They use a protocol and software referred to as HMIS, and report data to HUD and other government agencies in order to receive additional funding to pay for the supportive housing programs in S.R.O.s. Who knows where else the data goes on the way to those agencies - or where it goes afterwards?

H.S.I.'s Privacy Policy is posted only on their website; I've reposted the document on this blog because I have informally asked tenants at Kenmore Hall whether they've ever seen it, or been made aware of it when talking to social workers on the second floor, and so far I haven't met anyone who's seen or heard of it.

I'm tired of hearing other tenants blindly repeat the phrase "It's all about the money" like sheep. Of COURSE it's all about the money - H.S.I. wants to collect as much government funding as possible to provide social services to tenants, or they wouldn't be able to cover their salaries or the cost of programming (rent collected from tenants goes primarily toward maintenance and keeping the "physical plant" aspects of the building covered). BUT it's also about MUCH MORE - it's about your PRIVACY and your REPUTATION, TOO. If you're poor, your reputation is harder to protect - and it matters much more. If you don't care, keep talking to social workers and staff. According to their Privacy Policy, H.S.I. apparently feels no particular obligation to let you see your own records - especially if they think you're gathering information while planning a lawsuit.

The data collected by H.S.I.'s social workers whenever tenants talk to them may promote a false profile of every tenant living in their buildings. The profile can include being chronically homeless (remember, they call tenants who are NO LONGER HOMELESS "homeless," even though that's a contradiction in terms once someone signs a lease and starts paying rent - or having it paid for them through an appropriate program) or being MICA (Mentally Ill, Chemically Abusing). Not all of us are chronically homeless, and many of us are NOT mentally ill, drunk or high. Who knows how long the profile stays attached to a tenant, which agencies it's actually shared with, how long those agencies keep it on file, or how much of the information identifies specific people?
I'm bringing this up on the blog because many of the very people who complain about social workers aggressively pursuing them to participate in H.S.I.'s program don't seem to grasp the full implications of what's going on. It's tiresome to keep connecting the dots for people who aren't naive or gullible, but who keep asking the same questions over and over. I'd like to give people credit for the intelligence and wisdom their life experience should have earned them - but I keep having to repeat the same information, in print and here in the virtual "blog-o-sphere," and it is this:

Tenants in S.R.O.s who have rent-stabilized leases, as we do at Kenmore Hall, have plenty of rights. There are responsibilities that go along with those rights, too: on a practical level, you have to comply with what's in your lease and make sure your rent is paid. You're NOT, however, obliged to cooperate with anything that's NOT in your lease, including the social services provided by a supportive housing program. This isn't a jail, a shelter, or a nursing home. READ YOUR LEASE - not just the most current one, but the ORIGINAL one. And think carefully about how you want social workers, who have an agenda that may not match your own, to represent you to other organizations. The potential damage to your reputation could be significant and long-lasting.

By the way, many of us may already be on the Tenant Blacklist referred to in an earlier post on this blog (scroll down). Anyone who's been through housing court has a good chance of landing on that list - and it's just one particularly juicy example of how personal data gets sold and works to the DIS-advantage of the people on it.

The Dark Market for Personal Data
By FRANK PASQUALE OCT. 16, 2014
    [Illustration credit: Sam Potts]
    BALTIMORE, Md. - THE reputation business is exploding. Having eroded privacy for decades, shady, poorly regulated data miners, brokers and resellers have now taken creepy classification to a whole new level. They have created lists of victims of sexual assault, and lists of people with sexually transmitted diseases. Lists of people who have Alzheimer's, dementia and AIDS. Lists of the impotent and the depressed.
    There are lists of 'impulse buyers.' Lists of suckers: gullible consumers who have shown that they are susceptible to 'vulnerability-based marketing.' And lists of those deemed commercially undesirable because they live in or near trailer parks or nursing homes. Not to mention lists of people who have been accused of wrongdoing, even if they were not charged or convicted.
    Typically sold at a few cents per name, the lists don't have to be particularly reliable to attract eager buyers - mostly marketers, but also, increasingly, financial institutions vetting customers to guard against fraud, and employers screening potential hires.
    There are three problems with these lists. First, they are often inaccurate. For example, as The Washington Post reported, an Arkansas woman found her credit history and job prospects wrecked after she was mistakenly listed as a methamphetamine dealer. It took her years to clear her name and find a job.
    Second, even when the information is accurate, many of the lists have no business being in the hands of retailers, bosses or banks. Having a medical condition, or having been a victim of a crime, is simply not relevant to most employment or credit decisions.
    Third, people aren't told they are on these lists, so they have no opportunity to correct bad information. The Arkansas woman found out about the inaccurate report only when she was denied a job. She was one of the rare ones.
    'Data-driven' hiring practices are under increasing scrutiny, because the data may be a proxy for race, class or disability. For example, in 2011, CVS settled a charge of disability discrimination after a job applicant challenged a personality test that probed mental health issues. But if an employer were to secretly use lists based on inferences about mental health, it would be nearly impossible for an affected applicant to find out what was going on. Secrecy is discrimination's best friend: Unknown unfairness can never be detected, let alone corrected.
    These problems can't be solved with existing law. The Federal Trade Commission has strained to understand personal data markets - a $156-billion-a-year industry - and it can't find out where the data brokers get their information, and whom they sell it to. Hiding behind a veil of trade secrecy, most refuse to divulge this vital information.
    The market in personal information offers little incentive for accuracy; it matters little to list-buyers whether every entry is accurate - they need only a certain threshold percentage of "hits" to improve their targeting. But to individuals wrongly included on derogatory lists, the harm to their reputation is great.
    The World Privacy Forum, a research and advocacy organization, estimates that there are about 4,000 data brokers. They range from giants like Acxiom, a publicly traded company that helps marketers target consumer segments, to boutiques like Paramount Lists, which has compiled lists of addicts and debtors. Companies like these vacuum up data from just about any source imaginable: consumer health websites, payday lenders, online surveys, warranty registrations, Internet sweepstakes, loyalty-card data from retailers, charities' donor lists, magazine subscription lists, and information from public records.
    It's unrealistic to expect individuals to inquire, broker by broker, about their files. Instead, we need to require brokers to make targeted disclosures to consumers. Uncovering problems in Big Data (or decision models based on that data) should not be a burden we expect individuals to solve on their own.
    Privacy protections in other areas of the law can and should be extended to cover consumer data. The Health Insurance Portability and Accountability Act, or Hipaa, obliges doctors and hospitals to give patients access to their records. The Fair Credit Reporting Act gives loan and job applicants, among others, a right to access, correct and annotate files maintained by credit reporting agencies.
    It is time to modernize these laws by applying them to all companies that peddle sensitive personal information. If the laws cover only a narrow range of entities, they may as well be dead letters. For example, protections in Hipaa don't govern the 'health profiles' that are compiled and traded by data brokers, which can learn a great deal about our health even without access to medical records.
    Congress should require data brokers to register with the Federal Trade Commission, and allow individuals to request immediate notification once they have been placed on lists that contain sensitive data. Reputable data brokers will want to respond to good-faith complaints, to make their lists more accurate. Plaintiffs' lawyers could use defamation law to hold recalcitrant firms accountable.
    We need regulation to help consumers recognize the perils of the new information landscape without being overwhelmed with data. The right to be notified about the use of one's data and the right to challenge and correct errors is fundamental. Without these protections, we'll continue to be judged by a big-data Star Chamber of unaccountable decision makers using questionable sources.



    Frank Pasquale, a professor of law at the University of Maryland, is the author of the forthcoming book “The Black Box Society: The Secret Algorithms That Control Money and Information.”
    A version of this op-ed appears in print on October 17, 2014, on page A31 of the New York edition with the headline: The Dark Market for Personal Data.

