Archive for the ‘General’ Category

Upgrading to Windows 10

2016/07/15

The first good news is that the upgrade from Windows 8 (which I hated with a passion) to Windows 10 went very smoothly. The Windows 8 machine is my daughter's PC, and it had few applications installed. In fact the only problem I had was with a DVD player, which was quickly solved by downloading a new version of the player.

The second good news is that a "simple" upgrade from Windows 7 to Windows 10 also went well. I had to delete Chrome and OpenOffice and then reinstall their Windows 10 versions.

The upgrades were slow, even with a good cable modem, but it all worked. I was delighted that the upgrades restarted themselves intelligently each time the network burped.


The Leaky Leviathan

2015/06/21

David Pozen's Harvard Law Review paper [1] "The Leaky Leviathan: Why the Government Condemns and Condones Unlawful Disclosures of Information" explores the issues of leaks, plants, and combinations ("pleaks") of confidential government information. Such disclosures are an art form for U.S. politicians (and, I'm sure, for those of other nations). This paper will be required reading for any student of constitutional law or participant in government. For technologists such as readers of this blog, this paper begins the wrestling match around the Snowden leaks and the battle over the NSA's previously secret activities to intercept foreign and domestic Internet and telephone communications.

Pozen's paper analyzes the reality of how government works. It is easy to make something "secret" in one form or another, but government has created informal ways to incrementally relax secrecy. We've all heard newscasters attribute a story to an "unnamed source". When the story supports the government position in a measured way, it is a plant. When the government feels too much pain and the unnamed source was not controlled by the government, it is a leak, and top executives in the government squeal like little piglets at the wrong done. In reality, Pozen writes, tongue in cheek, plants need to be "nourished" by leaks. Otherwise, if all leaks were suppressed, plants would lose their believability and would be ineffective as a government tool. He points out that, historically, whistle-blowers and sources of leaks are rarely thrown in jail. They are, however, often shunned, losing valuable access to government executives.

The Creaky Leviathan by Sagar [2] is sometimes cited as a rebuttal, but I found it hugely supportive:

Let me be clear. Leaky Leviathan contains no hasty judgments. Pozen is admirably careful and measured in his praise of permissive enforcement; his broader objective is to show that the varied causes and consequences associated with leaks mean that the prevailing “system of information control defies simple normative assessment.” This conclusion I fully endorse. But the question remains: even if it advances our comprehension, does “middle-range theory” improve our ability to judge the value or worth of the prevailing “system of information control”? For the reasons outlined above, I think the answer must be no, since the overall consequences of permissive enforcement are obscure and hard to ascertain. As far as this “disorderly” system is concerned, the most we can do from a normative perspective, I think, is become clearer about whether particular outcomes it produces are more or less acceptable.

Whistle-blowing protection when it comes to national security issues is a dicey topic. The issue here is that huge losses of life and treasure are at risk. Counterbalancing such risk is that the national security infrastructure is huge: the NSA, the Pentagon and all of the DoD, the CIA, the NSC, and of course all of the Executive Branch (and some of Congress) represent the government side, and major contractors such as Boeing, L3, Halliburton, McDonnell Douglas, and Raytheon are also in the mix. A system of secrecy often makes it difficult to generate healthy debate; moreover, these institutions are "wired" to worry about and exaggerate threats. Secrecy in these institutions makes it difficult even for insiders to gain enough information to offer opposing views. Senator McCarthy, J. Edgar Hoover, Nixon, and others required outside forces to neutralize their negative actions. However real the communist threat was, McCarthy and Hoover violated the rights of many U.S. citizens. There were, in the end, no weapons of mass destruction in Iraq, and we went to war over this fabrication. The Maginot Line in France was an utter failure. Examples abound. The whistle-blowing that has been legitimized in the private sector is not well legitimized in the government sector. Current law limits disclosure to Congress and does not cover civilian contractors (thus Snowden is not protected; Manning was covered somewhat, but still got 35 years, possibly due more to the choice of WikiLeaks as an outlet). The Leaky Leviathan screams out for a legal structure that fairly protects national security whistle-blowing. Benkler's paper [3], "A Public Accountability Defense for National Security Leakers and Whistleblowers", takes a solid crack at this.

Benkler starts with an excellent in-depth review of how we got from 9/11 to the mess that Manning and Snowden disclosed. His "Public Accountability Defense" starts with the observation that most leaks are not prosecuted, because if they were, the non-prosecuted leaks would appear to be sanctioned and would lose their credibility to shape public opinion. He focuses on "accountability leaks", those that expose substantial instances of illegality or gross incompetence or error in important matters of national security. These are rare. One set occurred at the confluence of the Vietnam and Cold Wars with the anti-war and civil rights movements. The second deals with extreme post-9/11 tactics and strategies. Such leaks have played a significant role in undermining threats that the national security establishment has made to the "constitutional order of the United States" – in short, a very big deal. For many of us technologists, the issues we might consider leaking would rack our conscience and destroy our moral compass; we would be out of our element if, after the mechanisms inside the national security system failed, we had to create a leak ourselves. For example, the CIA's program of torture, rendition, and secret off-shore prisons somehow leaked without an Ellsberg, a Manning, or a Snowden. (The technology that enabled Ellsberg to release the Pentagon Papers was a massive, highly automated Xerox machine.) Benkler observes, "The greater the incongruity between what the national security system has developed and what public opinion is willing to accept, the greater the national security establishment's need to prevent the public from becoming informed. The prosecutorial deviation from past practices is best explained as an expression of the mounting urgency felt inside the national security system to prevent public exposure. The defense I propose is intended to reverse that prosecutorial deviation." What is needed is a defense, or at least a sentencing mitigation platform, that requires only a belief that the disclosure would expose substantial "violation of law or systemic error, incompetence, or malfeasance." This defense is based on the leaker serving a public good. It is not individual-rights based. It is belief based, and does not depend on later proven illegality of what was disclosed.

This is not a naive proposal, and many subtleties are discussed, for example, to whom the leak is made, how it is made, when it is made, what is redacted, how public the leak mechanism is, etc.

Benkler reviews several historical leaks, going back to 1942, when Morton Seligman leaked decoded Navy messages to a reporter. Publication would have disclosed that the Japanese codes had been broken, harming the war effort. Since no government wrongdoing was disclosed, the accountability defense would not apply. The long historical review ends with a discussion of the Manning and Snowden cases. Manning's 35-year sentence is obscenely excessive even though the criteria for an accountability defense are mixed. One would hope that, in the presence of an accountability defense, at least a more reasonable sentence would have been handed down. A detailed analysis of the Snowden case is given; with the single exception of leaks on the NSA's Tailored Access Operations (TAO), which target specific computers, the defense applies. One interesting issue is that the legal defense should be structured so that the prosecution cannot "cherry-pick" the weakest disclosure and prosecute that, ignoring the public value of the other disclosures.

The institution through which the leaks are made should also be protected by the defense. In some sense, a "free press" does this, but this should be clarified in national defense cases.

Finally, "punishment by process" is discussed. The government can ruin someone in many ways: huge legal expenses, long drawn-out trials, loss of access and jobs, etc. While protection from punishment by process is desirable, how to provide it needs to be addressed. I would think that technologists fear this the most.

I strongly recommend these three thought-provoking articles.

[1] http://cdn.harvardlawreview.org/wp-content/uploads/pdfs/vol127_pozen.pdf , "The Leaky Leviathan: Why the Government Condemns and Condones Unlawful Disclosures of Information", Harvard Law Review, December 2013, Vol. 127, No. 2, David Pozen

[2] http://harvardlawreview.org/2013/12/creaky-leviathan-a-comment-on-david-pozens-leaky-leviathan/ , "Creaky Leviathan: A Comment on David Pozen's Leaky Leviathan", Harvard Law Review Forum, December 2013, Vol. 127, No. 2, Rahul Sagar. [A mostly favorable review of [1]]

[3] http://benkler.org/Benkler_Whistleblowerdefense_Prepub.pdf , "A Public Accountability Defense for National Security Leakers and Whistleblowers", Harvard Law & Policy Review, Vol. 8, No. 2, July 2014, Yochai Benkler. [A well-reasoned and historically justified proposal for the legal structure of a whistleblower defense that is of particular interest to technologists.]

Removing Crapware from Windows

2015/05/28

Every so often my PC starts getting slow. In the task manager there are dozens of processes that I don't recognize. It's a real pain to clean these out. But I guess this is just basic maintenance that needs to be done. Here are my notes for today. I doubt this makes good reading, unless you land here via a search engine and want to see how I got rid of something.

The first lesson here is that removing crap is best done in the Administrator account, and not just in an ID with administrator privileges. Some utilities (sc for example) test for user ID and not just privileges. If you use Windows Vista, 7, or 8, this account is “hidden”. Sigh. If you’ve ever wondered what the option “run as Administrator” is, now you need it.

On the site windowsvc.com, I found this helpful way to remove crap installed as a service. In this case, I wanted to remove BrsHelper:

Open a command prompt by right-clicking its icon and selecting "run as Administrator". Use the following commands respectively to stop the service, disable its auto-start, and delete it entirely. For example,

sc stop "BrsHelper"

sc config "BrsHelper" start= disabled

sc delete "BrsHelper"

I note on the web that others get “Access Denied” with sc even when running it as Administrator. I didn’t have that problem, but beware. This seems like a nice utility. It does have a side effect of staying in memory after using it. I had to kill its process tree from the task manager when I was done with it.
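
If you are not sure of the exact service name, or want to see where its executable lives before deleting anything, sc can tell you that as well. A minimal sketch, using BrsHelper again just as the example name:

sc query state= all | findstr /i "BrsHelper"

sc qc "BrsHelper"

The first line searches the full service list (including stopped services) for the name; the second dumps the service configuration, including the path to its binary, which is handy when you later want to delete leftover files.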

The Administrator account isn't just hidden; it isn't enabled at all. To enable it, run the command prompt as Administrator as above, then type:

net user administrator /active:yes

Now the Administrator account is active, and you’ll see it when you want to log in or just change user accounts. BEWARE, initially it has no password. Be sure to set a good one if you want to leave it active. To disable it, repeat the above command with “no” instead of “yes”.
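
Since a passwordless Administrator account is an invitation for trouble, here is a minimal sketch of setting a password and then turning the account back off when the cleanup is done (both from an elevated command prompt):

net user administrator *

net user administrator /active:no

The first command prompts twice for a new password without echoing it; the second is just the disable step described above.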

There are other ways to do this. Vishal Gupta’s site www.askvg.com offers three other ways here.

I was trying to remove the crapware YTdownloader, and ran into the above Administrator problem. There is an interesting utility, autoruns.exe, which lists all of the programs that are set to auto-run. You must run this program as Administrator, but with it you can tune the auto-run entries without messing directly with the registry. You can also submit whatever you find to VirusTotal. My local McAfee claims there is a trojan inside YTdownloader.exe. There are other reports that it is malware. My early attempts to remove it got trapped by McAfee, which claimed that the program was moved to a quarantine area. But going to McAfee's interface for its quarantined files showed no sign of YTdownloader. I could find it using the file explorer, and there was a directory of the same name, which I could delete, but only as Administrator. This didn't get rid of a companion program, BrsHelper, which I killed as above.
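
For a quick command-line look at what is set to auto-run, without installing anything extra, something like the following works on Windows 7. It is only a rough subset of what autoruns.exe shows (classic startup entries, not services or drivers), but it is enough to spot names like YTdownloader:

wmic startup get Caption,Command,Location

Anything unfamiliar in that list is a candidate for further digging before you delete it.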

Incidentally, YTdownloader is sometimes called YouTube downloader. Beware of being tricked into installing YTdownloader by trying to download videos! I don’t understand the relationship here.

I also got rid of a couple of Dell programs with bad reputations: dkab1err.exe (the character after the "b" is the digit one) and DKADGmon.exe. They must have gotten installed when I used a Dell printer at one of my consulting clients' sites. With Administrator active, I had no trouble deleting them. I did have to deal with an extra prompt to continue, however. Just click it and move on.

The program biomonitor.exe was always running. The utility autoruns.exe didn’t list it. Apparently it is part of HP’s SimplePass fingerprinting tool. To delete it, kill the process tree for biomonitor from the task manager, and then uninstall HP SimplePass from the control panel.
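
Killing a process tree doesn't require the task manager; from an Administrator command prompt something like this should do it, with biomonitor.exe as the example at hand:

taskkill /f /t /im biomonitor.exe

The /im switch matches by image name and /t takes down the child processes as well; after that, the uninstall from the control panel can proceed.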

I came across a program WindowexeAllkiller.exe. While it looked interesting, it required the .NET framework, so I didn't try it. CNET warns that, while safe, it can get an inexperienced user into trouble. The author recommends checkpointing Windows before using it. The apparent goodness of this tool is that you can eliminate several bad programs at once. I suppose this is also why it is such a dangerous tool. Some feedback on this tool would be welcome.

Just as I thought I was done, I noticed an unexpected tab in Chrome for www-searching.com. (Note the hyphen.) I don't know how it got there. Since I was on a roll looking for strangeness, I quickly found that this program is a search engine of sorts, designed to track you and steal your personal information. The only damage it did to me was to install a shortcut to its site on my task bar. Of course I deleted the task bar item and the tab in Chrome, and then I did all the due diligence to get rid of potential infection elsewhere. I searched the registry, checked for Chrome add-ons and for a hijacked home page, checked the Chrome history (and was very surprised to find nothing), checked the scheduled tasks, searched the file system, and looked for ads by it. I couldn't find anything else. Malwarebytes is reputed to find and remove it, but a complete scan found nothing. Maybe I was lucky that I didn't try out this bogus search engine!
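
For the registry and scheduled-task checks, the command line is faster than clicking around. A minimal sketch of what I mean, with www-searching as the string of the day:

reg query HKCU /s /f "www-searching"

reg query HKLM\Software /s /f "www-searching"

schtasks /query /fo LIST /v | findstr /i "searching"

The reg query lines recursively search the current-user hive and the machine software hive for the string; the schtasks line dumps all scheduled tasks verbosely and filters for the name.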

I noticed on the web that www-searching.com was also similar to ohtgnoenriga.com (Gads, what language is that?) as well as search.conduit.com “Conduit Search”. I also looked for ohtgnoenriga and conduit.com on my system, and fortunately found nothing.

Finally, I deactivated my Administrator account as above.

Comments on NIST Interagency Report 7977, Draft 2

2015/03/24

To: crypto-review@nist.gov

Cc: crypto@nist.gov

Re: NIST Interagency Report NISTIR 7977 “NIST Cryptographic Standards and Guidelines Development Process” January 2015 Draft.

Date: March 24, 2015

The January 2015 Draft is a big improvement. In addition, the Summary of Public Comments on NISTIR 7977 (Feb 2014 (First) Draft) was quite helpful to compare how the current draft addresses comments on the first draft. The authors of both new documents are to be commended for their work.

General review comments:

A. Review comments on the first draft of NISTIR 7977 reflect a distrust of the NSA, and by extension of NIST processes. Suppose China sends NIST an email that says, "Hey, we've been studying elliptic curve E over field Z/p with base point Q. It has some neat properties, and China recommends that NIST adopt E for its elliptic curve encryption." Should NIST consider China's suggestion? A good test of the processes in NISTIR 7977 would be to answer "yes", even though we suspect that China knows a backdoor to encryption that uses E. Receiving such a suggestion from the NSA should be treated little differently. Even though NIST is required to consult with the NSA, most independent security consultants, post-Snowden, would not trust a curve suggested by the NSA any more than one suggested by China. I certainly would not. Therefore we look for processes in NISTIR 7977 that politely treat suggestions for tools and algorithms from the NSA with great suspicion. NIST should get expertise and knowledge from the NSA, but should not blindly accept its tools and algorithms. NIST processes for analysis and acceptance of any suggestion, be it from the NSA, from China, or from any other source, should be equally stringent. In particular, cryptographic technology should not be standardized without significant consensus from a wide swath of independent cryptographers. The current draft of NISTIR 7977 does not place enough emphasis on public analysis and consensus for acceptance.

B. Of course, NIST could standardize, say, an encryption method, even one from a private and independent source, and the NSA, China's PLA Unit 61398, the UK's GCHQ, or another nation-state cryptography group could know how to crack this encryption method, remaining silent during and after the standardization process. One would hope that NIST, through its own efforts and its financial support of academic and other cryptography research, would facilitate the discovery of the weakness and would quickly retire the standard. Such post-standardization life cycle efforts by NIST also need to be part of NISTIR 7977 [line 727].

C. If NIST were to publish a proof that a tool or algorithm received from China, the NSA, or another source in fact had no back doors and was totally secure, then a consensus on standardization might well be easily achieved. After believing such a proof, I might recommend the proven technology to my clients, but I probably would wait until a large number of non-government cryptographers also believed the proof and were also recommending this technology. NISTIR 7977 says, roughly, that NIST will "pursue" finding and using such proofs [line 55]. I'd be happy if NIST worked with the NSA and other agencies on such proofs and would recommend that such efforts be part of NISTIR 7977.

D. NIST publishes security papers at a prodigious rate, so fast that reviews are deemed inadequate. In light of post-Snowden caution around NIST processes, people naturally ask if these poorly reviewed papers can be trusted. It isn't going to help if NIST says, "It's ok, the NSA has reviewed it…" Not only does the current draft of NISTIR 7977 fail to convince that future NIST papers will receive good independent reviews, there is no indication that past NIST papers will retroactively receive good reviews. This is a very sad state of affairs, but it is fixable.

Some more specific review comments:

  1. Clarity of the NIST Mission: To develop strong cryptographic standards and guidelines for meeting U.S. federal agency non-national security and commerce needs. This mission should be parsed: to develop strong cryptographic standards and guidelines for meeting (1) U.S. federal agency non-national security needs and (2) commerce needs. My point is that the needs of commerce should be treated by NIST as equal to the needs of any federal agency. [line 202 Balance]. For example, federal agencies may well be happy with NSA algorithms, but general commerce may not be.
  2. I do not agree with the NIST Response (page 7 of Summary) to Technical Merit comments that NIST should give priority to non-national security federal information systems. NIST should always make commerce needs equally important. Such a priority statement doesn’t seem to be in NISTIR 7977 explicitly, but there are several statements about NIST being legally required to give such a preference when so ordered by a government entity.
  3. NIST's statement that it will "never knowingly misrepresent or conceal security properties" [Summary page 3; line 215 Integrity] reminds me of Barry Bonds's statement that he "never knowingly took growth steroids" when his hat/head size at the end of his career was three sizes larger than when he was a rookie. I would prefer a more proactive statement such as "NIST will make every reasonable effort to ensure that military, intelligence, and law enforcement agencies do not, by their suggestions, review comments, or contributions, compromise any security tool or algorithm recommended by NIST." For NIST standards to regain the confidence of the security and general commerce communities, NIST processes should convincingly ensure, through public NIST actions, that its tools and algorithms do not compromise the privacy or the integrity of any commercial or private message being protected by NIST standards.
  4. FISMA requires that NIST consult with certain federal agencies, including the NSA, to avoid duplication of effort and to maintain synergy of federal information protection efforts, but NIST can never in the future blindly accept a recommendation from any public or private agency. What is important is that NIST actively regain and maintain its process integrity via the new NISTIR 7977. The current draft falls short.
  5. NIST should consider resolving the conundrum of needing NIST output frequently and needing adequate public reviews of such output by the creation of additional outside paid review boards and conformance testing bodies. Such a review board should be established to review annually the entire Cryptographic Technology Group. [lines 316 and 326]
  6. Minutes of the monthly NIST/NSA meetings should be published. [line 377]
  7. Independent review boards should have the power to reject a proposed standard, say if NIST could not convince the board that the NSA or another agency has not compromised the standard. [page 4; lines 47, 403, and 464]
  8. The NISTIR 7977 Development Process itself should undergo regular review and updates at, say, an annual frequency.

Requested Comments [line 125]:

The NIST questions are listed below, each followed by my comment.

Do the expanded and revised principles state appropriate drivers and conditions for NIST’s efforts related to cryptographic standards and guidelines?

Yes, but if the word "appropriate" were replaced by "adequate", then no. Neither the integrity of NIST processes in the face of NSA influence nor the issue of adequate review is satisfactorily addressed. [A, C, D]

Do the revised processes for engaging the cryptographic community provide the necessary inclusivity, transparency and balance to develop strong, trustworthy standards? Are they worded clearly and appropriately? Are there other processes that NIST should consider?

No. "Trustworthy" standards need public confidence that the NSA or another agency has not added backdoors or other weaknesses to its contributions and does not know of any. Wording isn't the issue. Different new processes are necessary to separate NIST from NSA and other related agency influence. Post-standardization research efforts should be funded as part of all life cycles [B].

Do these processes include appropriate mechanisms to ensure that proposed standards and guidelines are reviewed thoroughly and that the views of interested parties are provided to and considered by NIST? Are there other mechanisms NIST should consider?

No. Cf. A, C, 2, 3, 4, and 7 above. Regarding 7, if NIST won't vest veto power in independent reviewers, such experts will tend not to participate. Lack of review resources also seems to be a problem. Cf. 5 above.

Are there other channels or mechanisms that NIST should consider in order to communicate most effectively with its stakeholders?

Yes. More paid outside reviewers including an annual review of the Cryptographic Technology Group. Cf. D and 5 above.

Respectfully submitted,

Gayn B. Winters, Ph.D.

Technology Consultant

Superfish

2015/02/24

I don't own a Lenovo PC, so I shouldn't get upset over Lenovo pre-installing adware from Superfish. This adware apparently used an SSL interception library from a third party, Komodia (an Israeli startup), which put a bad root certificate on the Lenovo PC and allowed a man-in-the-middle attack on all web sites. The web site's SSL certificate showed the issuer as Superfish. Now, I hate preinstalled software and routinely delete it when I (or a friend) get a new PC. The annoying thing here is that just removing Superfish doesn't remove the bad certificate, and the MITM exploit can continue. Lenovo has apologized and has a removal tool (from McAfee, but other vendors have one as well). Lenovo has been hit with a class action suit, which I hope will extend to Superfish and to Komodia.
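
A quick way to check whether the rogue root certificate is still hanging around, even after the adware itself is removed, is to dump the machine's trusted root store and search for the issuer name. A minimal sketch from an elevated command prompt ("Superfish" is the issuer name reported in the press; the exact string in the store may vary):

certutil -store Root | findstr /i "Superfish"

If that prints anything, the certificate should also be deleted, either through certmgr.msc under Trusted Root Certification Authorities or with certutil -delstore Root and the certificate's name or serial number.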

I don't particularly like government intervention, but if it were clear that Lenovo and its suppliers were guilty of some federal crime and subject to huge fines, it might dissuade PC makers from preinstalling such crap onto their PCs. I know that PC profit margins are thin and preinstalled software adds revenue, but really! (There is also the fallacy that preinstalled software enhances the PC by making it usable and attractive right out of the box. If you think this is attractive, think about the Superfish infection!)

Some people like to reinstall Windows – assuming they have an installation disk from Microsoft. (This is less hassle for open source operating systems, because such a disk image can be downloaded.) In this way, all the crap installed by the PC manufacturer doesn’t get installed.

Keyloggers as Trojan Horses

2015/01/15

Read this Register article for starters.

A $10 giveaway at, say, conferences could be an effective Trojan Horse. Samy Kamkar (@samykamkar) has released schematics for a keylogger built on the open source Arduino hobby board. I don't quite see how to get the cost down to the claimed $10 without a lot of volume, since the Arduino board retails for $25. Perhaps the NSA has better secret manufacturing plans with more nefarious delivery ideas.

Samy of course hopes that Microsoft and other keyboard manufacturers will address such a security hole.

2015

2015/01/01

Every few years I put together a talk about technology expected to develop or become prominent in the New Year. As I contemplated such a talk for 2015, my thoughts were stuck on cyber-security. While the 2014 Silicon Valley IPOs in storage and in health care were astounding, my thoughts are still on the vast discrepancy between the sophistication of malware attacks and the woeful inadequacy of corporate defenses. Various estimates of cyber-theft losses run into the hundreds of billions of dollars. These losses are hard to quantify. Even a kiddie virus that "only" disrupts a local network can cost millions of dollars in repairs and lost revenue. How do you put a numerical value on opportunity cost, which courts never quantify? How does one value the careers of the Target CEO and CIO who lost their jobs? Imagine the settlement if these two people alone could sue the perpetrators of the November 2013 Target breach in US court!

It is disheartening to contemplate that both the November 2013 Target breach and the recent Sony breach were preceded by successful, but smaller, earlier breaches. They of course were also preceded by other breaches into other companies. Will 2015 be the year that people wake up? Yes and no.

Let's first consider the retail industry. We've had breaches at Target, Home Depot, Neiman Marcus, Sally Beauty, Kmart, Dairy Queen, Michaels Stores, P.F. Chang's, Heartland Payment Systems, Goodwill, Supervalu, Staples, Jimmy John's, Bebe Stores, Sheplers (western wear), Chick-fil-A, OneStopParking (Krebs claims the same attackers as Target's), and probably many others in the retail industry that I haven't studied. If this isn't enough to motivate CEOs of retail companies, consider the very public breaches outside retail: Google (exposed Chinese Gmail accounts), Epsilon, Sony (PlayStation), Sony Entertainment (movies), the US Dept of Veterans Affairs in 2009, Global Payments 2014, AOL, eBay, JPMorgan Chase, Adobe, United Parcel Service (UPS) Stores, Sands, etc.

No longer can the CEO of even a modestly large retail outlet assume it will be the "other guy" whose credit card database gets attacked. The board room topic of how much to increase the IT budget to address security will come up. The answer will be something like 10%. This is so wrong for many reasons. What most IT departments need is a total cultural change: new people, new expertise, new software, new security products, new processes, and new influence that will affect the entire company. This doesn't even count the pain that Microsoft is forcing companies to suffer by shutting down support for older Windows products, notably XP and Server 2003. My guess is that the correct board room answer should be 100-200% (and even higher in capital costs for things like chip-and-PIN support) and not a paltry increase such as 10%. If those triple-digit percentage increases are even floated, they will get shouted down as not affordable.

Affordability here isn't a technology topic; it is a topic for the Harvard Business Review: restructuring the retail industry. I've seen hints of this. Authors scratch the surface on topics like "Who pays for a breach?" and "How much should one spend on security?", where they look at the probability and severity of a loss for a retail company and come up with some low recommendation. Bruce Schneier gave a couple of insightful talks in 2014, which I cynically interpret as saying, "Look, for all the reasons that I've just explained to you, you're going to get hacked, so put your money on incident response. Namely, invest in recovering from the inevitable attack." Bruce's company Co3 Systems sells incident response products and services. To be fair, Bruce doesn't say not to invest in malware defense, but rather, don't fail to invest in incident response.

My first 2015 predictions: Large retail companies will not restructure, but they will wrestle with this affordability problem. Security product and consulting companies will do very well as a result. Incident response companies should also do well. Malware defense products will improve; however, retail companies will continue to be hacked. The attacks will escalate and increase in sophistication. Damage will continue to rise. The recent Sony Entertainment attack shows that retail won't be the only target (no pun intended).

OK, what really scares me? It isn’t retail! If we admonish retail and related companies for ignoring early warning signs of malware attacks, aren’t we blissfully ignorant of the warning signs for infrastructure attacks? We are. In fact, most of the cyber-security articles that I read also ignore this.

My second 2015 predictions: The United States will suffer a cyber-attack on some infrastructure site in 2015. The technology of Stuxnet and its predecessors and follow-ons Duqu, Flame, Gauss, Wiper, Mahdi, Shamoon, sKyWIper, Miniduke, Teamspy, etc. provides a roadmap for even the smallest nation-states to follow for such infrastructure attacks. In fact, if you take Ralph Langner's excellent paper "Stuxnet's Evil Twin" and substitute your favorite industrial mechanism for "centrifuge", the result isn't a bad outline for how to proceed with such an attack. An actual attack, say on our electrical grid, would have to modify multiple SCADA systems and multiple flow control and transmission devices. Lots of code needs to be modified from the Stuxnet code, but a small nation-state could do it. (North Korea is reported to have 1800 skilled software engineers engaging in cyber-espionage. Such a large team and their contractors could do it.) Perhaps a smaller team of educated terrorists could carry out a more focused attack, say on a single industrial site.

We are not without early warnings beyond Stuxnet itself. SiliconANGLE's May 2014 article on Iranian and Syrian attempted attacks on US energy firms is such a warning. Certainly the attacks on Iran's and Saudi Arabia's oil infrastructure are warnings. A GCN article on an attack on Iran is here. A CNET article on a Qatari LNG attack and a Saudi Aramco oil company attack is here. This CNET article outlines multiple variants of the malware potentially used in these attacks. The February 2014 RSA Conference had multiple, more technical talks on this topic. I expect even more at the 2015 conferences in the US and in Asia Pacific & Japan.

What totally surprises and scares me is that the U.S. has not yet had a serious infrastructure cyber-attack, while mid-eastern countries have had such attacks. Advancing past Stuxnet, the attack software is becoming more sophisticated and powerful. The U.S. is due…

Sands and Adelson – a secret cyber attack prior to Sony

2014/12/24

In October 2013, the ultra-conservative Sheldon Adelson spoke at the Yeshiva University Manhattan campus and called for bombing Iran. Adelson, the 22nd richest person in the world, owns 50+ percent of the Las Vegas Sands Corp (LVS), which owns the Sands, the Venetian, and other such properties around the world. On Feb 10, 2014, LVS came under a massive cyber attack, which LVS and the US Government kept secret until recently. On December 11, 2014, Bloomberg published this 5-part report. It's a great read!

The Cuckoo’s Egg – Revisited

2014/12/17

The other day I picked up a used copy of Cliff Stoll’s book The Cuckoo’s Egg about his search for a hacker, ultimately identified as a German, Markus Hess. Hess was using Stoll’s Lawrence Berkeley Labs computer as a base to infiltrate various government computers to steal (and sell to Russia’s KGB) government documents.

In this age of "Advanced Persistent Threats", this 1986-88 threat was hardly "advanced". In fact, Hess' basic break-in approach consisted of trying simple passwords for known system and vendor accounts. Today, this is still an effective break-in approach! People are too lazy to create complex, but easy to remember, passwords.

Hess also used known bugs in system programs to escalate his privileges. Today, I get weekly CERT notifications of such bugs. There are hundreds of them announced annually. Nothing new (or advanced) here!

What did impress me about this story was that Hess was amazingly persistent. His efforts spanned many months. He was careful – always checking to see if some system person could be watching, and if so, quickly logging off. When a cracked password was changed, for example, Hess quickly moved to another system and kept his attack going. "Persistent" threats aren't new.

Hess copied password files to his system, presumably for off-line brute force (albeit simple) dictionary attacks to “guess” passwords. Some of these attacks were successful.

Also impressive was that Stoll set up an automated warning system to track Hess' intrusions. It was not visible to Hess, but it automatically recorded his keystrokes on Stoll's computer. Its design made it impossible for an intruder to delete or modify its records. It was an early threat detection system, primitive compared to today's detection systems, but it was instrumental in the discovery of Hess and Hess' cohorts. Stoll also manually kept a log notebook, which I would still recommend in the analysis of any attack. Such a notebook would include all aspects of an investigation, including interactions with network vendors, government agencies, and interested parties. Stoll's astronomy training included "If you don't record it, it didn't happen…" – a good message for today's network forensic engineers.

Another feature of Stoll's detection system was the creation of what we today call a "honeypot". His was rather simple: just some fake, but apparently interesting, documents that needed system privileges to read. Stoll left his computer open so that the attacker could be tracked, but some government computers were forced to clamp down immediately. I've seen companies today (Google comes to mind) where systems are left vulnerable to track intruders so long as damage can be contained and not affect customers. Leaving a system and a honeypot open for the analysis of a threat is a good technique.

I found it hilarious that in the course of watching Hess attack various government and government contractor systems, Stoll was told by the owners of these systems, "It's impossible to break into our system; we run a secure site." I'm reminded of all the retail vendor breaches occurring these days, as well as the Stuxnet-like attacks.

Finally, Stoll had trouble getting help from the FBI, the CIA, and the NSA. The FBI has certainly beefed up its computer expertise since 1988, but it still will refuse to help anyone deal with an annoying hacker who does not cause serious financial damage. My recommendation is to prepare in advance an argument for why a threat can potentially cost your company lots of money. Homeland Security has a bevy of agencies to combat cybercrime; learn about them here.

References

  1. Stoll published a May 1988 Communications of the ACM article "Stalking the Wily Hacker" that outlines chasing Hess. His book is better and is also an easy and quick read.
  2. The Cuckoo’s Egg, Doubleday 1989
  3. TaoSecurity’s Richard Bejtlich’s excellent talk on chasing Hess (has good photos).
  4. “The KGB, the Computer, and Me”. Video that tells Stoll’s story.

Six Windows Utilities – nice article

2012/12/05

Lloyd Case wrote a nice PC World article "Six awesome built-in Windows utilities no one knows about", dated Dec 4, 2012. Here is a link. The word "awesome" is an exaggeration, but it is good to have them pointed out to us non-Windows experts. Five of the six utilities work in Windows 7, and perhaps they work better in Windows 8 (I only have W7). They are:

  1. Reliability report: type “reliability” in the search/command window.
  2. Direct X Diagnostics: type “dxdiag” in the search/command window.
  3. Display Calibration: type “calibrate” in the search/command window.  Works with two displays. Some video cards come with a similar utility.
  4. Record Steps: type "steps" in the search/command window. Click on the record button in the pop-up window. The article recommends a more sophisticated capture tool, Camtasia, but I haven't tried that. This works ok. It creates a zip file that you can save, open, and edit.
  5. Schedule jobs: type "schedule" in the search/command window. The resulting window gives lots of options with which to submit batch or scheduled jobs. (See the command-line sketch after this list.)
  6. Hyper-V.  Needs to be turned on in the Control Panel, but my W7 OS did not have it.  The idea would be to run virtual machines and your choice of OS on the VM.
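
For item 5, there is also a command-line route via schtasks, which is handy when you want to create a job from a script instead of the GUI. A minimal sketch, with a made-up task name and batch file path:

schtasks /create /tn "NightlyCleanup" /tr "C:\scripts\cleanup.bat" /sc daily /st 02:00

schtasks /query /tn "NightlyCleanup"

The first line creates a daily task that runs at 2 AM; the second shows its status. It can be removed again with schtasks /delete /tn "NightlyCleanup".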

This article also has a lot of negative reader comments on Windows 8.  I haven’t tried W8, but my recommendation is always to stay with the older OS as long as possible, upgrading it with Service Packs when available.

 

-gayn