This week's sponsor is CA Technologies.

Webinar: Rethinking Enterprise Mobility Management – Beyond BYOD
Thursday, May 29th, 12pm ET / 9am PT
Our panel of experts will help you understand how to develop effective strategies that accelerate mobility transformation and prepare your organization for the mobile future. Register Today!

Editor's Corner: A guide to life in the post-era era

Also Noted: PGi
Spotlight On... A workable, device-agnostic comeback plan for BlackBerry
A 50-year-old platform is commemorated; a 5-week-old platform is eviscerated; and much more...

Follow @fierceentcomm on Twitter

News From the Fierce Network:
1. Beyond Verbal's emotion-recognition tech could change customer service
2. Say hello to the CMTO
3. Don't confuse digital business with IT, Gartner warns

This week's sponsor is Mashery.

Delight & Engage Customers with Mobile APIs
Read this success story and learn how a robust API and secure API management helped Keep's iOS app become one of the most popular apps in the Lifestyle category in the iTunes App Store. Read now!

A guide to life in the post-era era

I once wrote that the history of personal computing could be divided into eras, and that it appeared thus far that "eras" lasted two years apiece. It was the kind of sweeping statement that set the theme and the tone for a publication that was soon to become the best-selling newsstand magazine in America. No, it wasn't Playboy. It's a statement that an ambitious, self-assured amateur makes to establish himself as experienced beyond his years, which is perhaps easier to accomplish when one is 23.

Since that day in 1988, I've lost track of how many eras there have actually been in computing, and I've forgotten several of their names. Upon the scrolling annals of many a blog, I've demonstrated myself before audiences of dozens to be inexperienced beyond my years. Nevertheless, I'm mindful of the passage of days and weeks--more so today than ever before. And I can accurately report that the period of time between being besieged with the latest wave of emails from PR folks declaring the dawn of the "post-PC era," and Microsoft CEO Satya Nadella declaring the dawn of the "post-post-PC era," is two weeks.

Two years ago (perhaps in a previous era), Microsoft hired as its Distinguished Technical Evangelist a man named James Whittaker. In a company blog post presenting four steps to giving a presentation that changes the way people think, Whittaker's rule #3 is "Make It Epic." The problem with making everything epic is that, eventually, nothing is epic. It seems fitting that one of the pictures in Whittaker's blog post is of Albert Einstein, the first person to successfully attempt to equate subatomic physics with common sense.

There comes a period of time, after the hundredth or so repetition of the dawn of a new era, when you stop believing the passage of time is marked by product milestones. You begin to see that, in a universe where everything is equivalently relative, nothing is all that distinguished. It's like picking at random any hundred movie trailers produced in the last decade. A vast majority of them have the same screeching, rising pitch of the string section, as though it were impersonating a vacuum cleaner, coupled with the thunder of taiko drums. On screen is depicted the end of the post-apocalyptic era, where some colossal machine run amok under tyrannically misguided corporate interests is brought down by a speech by Tom Cruise. In a world... one man... will herald a new era.
Maya Angelou died this week. Already her passing is being called the end of an era. She has been called a "literary legend," which would probably make her laugh in that most potent and refreshing of laughs she had, knowing that all things literary are, by definition, legends. Maya herself said that the act of dying is not a passage but a liberation. I grew up with poetry, and I have loved Maya's voice giving life to her words the same way I've loved Itzhak Perlman giving life to notes, Bertrand Russell giving life to logic and Albert Einstein giving life to reason.

Truth is always truth; it requires no eras. It is timeless, though our appreciation of it may always be renewed like the dawn of a new day. Truth does not require the epic thunder of taiko drums or vacuum cleaners. It doesn't need fanfare to remind us of its pertinence. Stage presenters require "eras" to create mental pictures of the scenes in which they spin their ideas. But like five-year-olds in the back seat attempting landscape drawings on their Etch-a-Sketches, when their ideas change--as ideas must do, to survive--they erase their background scenes and start over from scratch. The dawn of a new era is the erasure of precedent, a move to make you believe what's past is no longer prologue, even when the past is but two weeks ago.

Poetry is the greatest conveyor of truth in language. If an ideal aspires to be truthful--even a technological ideal like security or privacy or low latency or high resolution--then it transcends time, like poets for whom even leaving this world is but the opening of the next doorway forward.

[Today's column is dedicated to the memory of the author's favorite era: his grandmother, Era Ruby Skinner Pool.]

Read more about: Satya Nadella
back to top

Today's Top News

1. Appeals court upends ruling on rural broadband fund

Outside the Federal Communications Commission, there is widespread objection to proposed revisions to its Open Internet regulations allowing for "commercially reasonable" agreements between service providers and content providers--agreements that opponents fear would enable ISPs to charge premiums to certain content providers, or even to consumers. The single alternative offered thus far is for the FCC to resume regulating the Internet like a public utility. In its Verizon decision earlier this year, the D.C. Circuit Court of Appeals let the FCC get its foot in the door for Title II regulation under the 1996 Telecommunications Act. Last Friday, the Tenth Circuit Court of Appeals in Denver may have pried that door open the rest of the way.

The Tenth Circuit's decision affirms the FCC's right to repurpose its Universal Service Fund--geared toward expanding telecommunications service to rural areas--as the Connect America Fund, for expanding rural broadband service as well. A list of service providers over seven pages long stood opposed to this move, on the grounds that funding broadband expansion exceeded the FCC's statutory authority. In their argument, the telcos first pointed to the key weakness in the FCC's position, made evident in its defeat to Comcast in 2010: If the Internet truly is an information service, as the FCC itself began defining it, then it falls outside the FCC's own regulatory purview. Second, even though the FCC promised to fund only projects in regions of the U.S. where telcos have not yet established competitive presences, the fund would benefit certain companies that were not telcos themselves.
But in their decision Friday, the three-judge panel split the '96 Act into shards, saying that any interpretation of the limits of the FCC's authority to regulate Internet service under Sec. 706(a) should not necessarily apply to Sec. 706(b). Specifically, it's 706(b) that grants the FCC the authority to raise funds to deploy "advanced telecommunications capability... to all Americans in a reasonable and timely fashion," and to take immediate action where necessary to accelerate deployment. Sec. 706(a) grants the FCC similar authority over "telecommunications services," but not "advanced telecommunications capability."

Did Congress mean for these two phrases to mean distinct and separate things? In lieu of a time machine with which we could go ask Congress, legal precedent says that judges can't just come up with an answer on their own. Instead, they must apply what has come to be called "Chevron deference," which has nothing to do with the oil company except that the precedent was set in a case involving Chevron. Essentially, when a law passed by Congress is ambiguous--whether unintentionally or by virtue of circumstances changing through history--about the authority it grants a regulatory agency, judges must defer to how that agency interprets the law for itself, so long as that interpretation appears reasonable.

The order of proposed rulemaking for the fund explicitly stated that the FCC did not believe the question of whether VoIP services are telecommunications or information services impeded its ability to institute the fund. In addition, the FCC's attorneys had argued that the agency interpreted the distinction between the two phrases in sections 706(a) and 706(b) as intentional. On that basis, the Denver court found the FCC's interpretation reasonable, and threw out the telcos' case. If the Verizon decision gave the FCC a path for re-adopting Title II-style regulation if all else fails, last week's Tenth Circuit decision laid down asphalt and road signs.

Related Articles:
Another side of net neutrality: The case in favor of Title II
Embracing VoIP against a regulatory minefield
Sprint partnership suddenly makes rural LTE a real market

Read more about: universal service
back to top

This week's sponsor is Meru Networks.

Download the White Paper "802.11ac in the Enterprise: Technologies and Strategies" to learn from industry expert Craig Mathias about the technologies behind 802.11ac and deployment misconceptions, and to review steps that every organization should take in getting ready for 802.11ac. Click here to download.

2. Future Adobe Reader will include DRM for e-signatures

An upcoming release of the Adobe Reader client for reading PDF documents on PCs and mobile devices will contain extended use of digital rights management technology--this time put to use in enforcing per-user access policies. Such policies will make it possible, at some point over the next year, for admins to use a cloud-based access management console to control access to digitally signed documents. This news comes from Adobe's EchoSign unit in an interview Wednesday with FierceEnterpriseCommunications.

"How do you protect something that has been signed, and make sure that you trust it, after the signing experience?" asks Jon Perera, vice president and general manager of Adobe's EchoSign unit. Perera is asking rhetorically, of course. "One of the most interesting technologies we'll use there is digital rights management," Perera tells FEC.
"This really is something that Adobe can bring to bear, because Adobe's been in the DRM business for 15 years. EchoSign is using that technology so that, after a contract has been signed, a company can apply policy to that contract and ensure that only the right people open it." Perera points out that a signed contract, once pilfered by an outside source, can still be posted to WikiLeaks, uploaded to an unsecured public storage location like Dropbox, or copied to a thumb drive. An undeniably significant opportunity is lost there for the same technology that authenticates the signature, to effectively secure access to the document. When Adobe acquired digital document signature provider EchoSign in 2011, the new parent company promised to make EchoSign for signatures as ubiquitous and commonplace as Flash for animations and PDF for documents. A number of underestimated barriers ended up postponing that goal, foremost among them being the cultural barrier. EchoSign recently surveyed 351 non-customers--enterprises from medium- to large-scale that do not use digital signing technology from any vendor. Some 60 percent of those questioned were not even aware of the product category--they didn't know the technology existed. The remainder reported a lack of urgency for adopting e-signing, and a sizable plurality was under the impression that electronic signatures were not considered legally valid. The company could find these customers quite handily because, by its own estimate, the technology has only pervaded 15 percent of enterprises. That's literally no change from levels garnered from surveys conducted at the turn of the century. Adobe hopes to solve this problem by continuing to build EchoSign into a cloud-based platform for authenticating e-signatures that can itself be leveraged as an authentication mechanism for document access control--and conceivably other purposes as well. But in a move that may re-ignite skepticism about Adobe's intentions--dating back to 2007, when it first submitted PDF to an international standards body--EchoSign plans to deploy its service in such a way that competing PDF reader applications such as Nitro Pro will not be able to vouch for signatures' authenticity... at least not right away. And that may end up making professional PDF documents not so portable. "If the user tries to open up a contract that has been signed and protected with Adobe's DRM technology, it won't open in any other client than Adobe Reader," states Perera. "And 99 percent of the time, the customer already has Adobe Reader on their laptop or phone or tablet. Reader goes to 'dial home' to make sure that user has permission to open that [document] up. If you try to open it up with any other reader, it won't work. This means that we've got a pretty elegant and low-cost way for hundreds of millions of users and companies around the planet to immediately take advantage of this technology, without having to install anything on the client side." Perera tells us his division is exploring the creation of APIs that will enable developers outside of EchoSign to address the signing mechanism, which may perhaps alleviate any document portability issues that may arise. He noted that some 30 percent of accesses to EchoSign's current cloud service come from API calls, and expects that number to rise to 75 percent in two years' time. 
Related Articles:
NitroPDF goes after Adobe with latest release [FierceContentManagement]
Mobile workers frustrated working with documents, finds IDC [FierceMobileIT]

Read more about: Adobe Reader
back to top

3. Survey: Most development teams don't inventory their open source

Over the last decade, there has been no significant change in how software development teams handle the incorporation and dissemination of open source components in their software and services. This is based on information released six years ago, more information released two years ago, and new survey information being released this week by the proprietor of a cloud-based repository service called WhiteSource. Granted, vendor-driven surveys have a tendency to corroborate whatever business need the vendor is striving to address with its products. But in this case, WhiteSource is corroborating a plainly obvious situation, made more evident in recent months by bewilderment over the scale of the OpenSSL vulnerability: Teams that use open source components in their software and services may not be doing so in compliance with those components' respective licenses, and in many cases may not even be aware they're using them at all.

Some 53 percent of the 120 software development organizations that WhiteSource surveyed reported they do not keep any inventory of the open source components they use. While 29 percent of respondents do produce an inventory list every few months or so, another 18 percent were willing to admit they typically realize they're using something from the open source community only when it first shows up in their code.

"For the most part, open source is mismanaged by software vendors large and small," said Rami Sass, WhiteSource's CEO, in a webinar on Wednesday. "Most companies still rely on resource-intensive manual processes performed by R&D, developers or DevOps guys. This results in costs that are relatively hidden, and exposes the company to a wide range of security vulnerabilities which, in turn, can cause even greater internal costs or open the company up to various risks."

The key difference between the situation then and today may be the introduction of risk managers. Often affiliated with insurance companies, risk managers are being engaged by the financial heads of enterprises more and more, not just to reduce insurance costs but to identify and ameliorate latent risks. The OpenSSL vulnerability was certainly a latent risk, and even now developers may be using older, unpatched versions of the library in their software. Risk managers may not be developers themselves, but they know how to nag development teams into cooperating with them. One tool at risk managers' disposal is an estimate of wasted costs--the time developers spend manually evaluating their OSS inventories, or patching their software to comply with licensing terms after failing to take inventory. This may be the leverage point that gets WhiteSource's service noticed, particularly among the 75 percent of development organizations that, according to the company's survey, do not have any policies regarding how to license and use the open source contributions of others.

In a demo, Sass showed how his company's platform can not only produce an inventory of OSS components actively in use, but also chart the dependencies those components have upon other OSS components.
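WhiteSource's platform is proprietary, and the sketch below is not its method. But to make concrete what "taking inventory" means, here is a minimal Python example that, assuming a project whose dependencies are installed in the current environment (Python 3.8 or later), lists each component, its declared license, and the components it in turn depends on:

# Minimal open source inventory sketch -- illustrative only, not
# WhiteSource's method. Requires Python 3.8+ and inspects whatever
# distributions are installed in the current environment.
from importlib.metadata import distributions

def inventory():
    """Print each installed distribution, its version, its declared
    license, and its own declared dependencies."""
    for dist in sorted(distributions(), key=lambda d: d.metadata["Name"] or ""):
        name = dist.metadata["Name"] or "unknown"
        license_ = dist.metadata["License"] or "unspecified"
        print(f"{name} {dist.version}  license: {license_}")
        for dep in dist.requires or []:  # declared dependencies, if any
            print(f"    depends on: {dep}")

if __name__ == "__main__":
    inventory()

Even a crude list like this answers the survey's core question--what open source is actually in the product--and flagging known-vulnerable versions then becomes a matter of checking that list against a vulnerability feed such as the NVD.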
WhiteSource also produces a real-time analysis of vulnerability reports on those components, gleaned from official sources as well as from throughout the open source community.

For more:
- read coverage on tracing open source components from 2008
- coverage from two years ago on ReadWrite
- see the WhiteSource site

Related Articles:
OpenSSL finance chief: Funding unrealistic for mitigating the next Heartbleed
Linux Foundation enlists Microsoft, Cisco, Facebook to help save OpenSSL
A post-Heartbleed debate on whether open source has failed

Read more about: OpenSSL
back to top

4. Title II advocate points to 1956 precedent for cloud

Cloud computing is typically presented, in the context of pages such as these, as a completely new and often foreign phenomenon. It alters the complexion of data centers and converts software into services. One seldom-heard argument is that cloud computing--effectively the merger between information and telecommunications systems on a colossal scale--could not have come into existence without an effective legal precedent.

The date of that precedent is January 24, 1956, says former congressional staffer and former Capitol Hill telco industry advocate Earl Comstock, now a practicing attorney with Eckert Seamans Cherin & Mellott, LLC, in Washington, D.C. It was on this date that the original AT&T Corp. signed its first consent decree with the U.S. Dept. of Justice (the second would come in 1982), enabling it to keep its monopoly over the nation's telephone system so long as it stayed out of any other industry.

In Part 1 of FierceEnterpriseCommunications' interview with Comstock last week, he made a case for the FCC ending its Open Internet debate now and resuming its prior policy of regulating the Internet somewhat like a telecommunications service--as it had been doing prior to Chairman Michael Powell's first articulation of net neutrality principles. Now Comstock tells us that, nearly six decades ago, AT&T floated the notion that large-scale computers would require telecommunications systems--and therefore long lines--to share processing power and data storage. It was purchasing computers for this purpose, and wanted to recoup some of its costs. Even back then, AT&T foresaw the possibility of selling excess computing capacity from its switching stations, using a utility billing model.

"They didn't call it 'cloud computing,'" says Comstock, "but essentially having your computer processing ride in the cloud, that's an idea that goes back to the original Computer I decision.

"People don't realize this, but in the 1960s, when they started moving to digital switching equipment and using computers to switch telephone calls," he continues, "the telephone companies said, 'Wait a minute. I've got this excess processing capacity in these big switches that businesses could use to do other things. I'm using it to switch telephone calls, but they could use it to do something else when I don't need it.' That's what started this whole process, in the late 1960s, on through the '70s, to the 1980 [Computer II] decision."

Two of AT&T's competitors in this ancient prototype of the cloud market were IBM and EDS. They relied on the telephone network to perform maintenance on their mainframes remotely--remote administration. These two competitors became AT&T's first customers for raw compute capacity.
One clear reason this business was created in the first place was to enable AT&T to gain a foothold in the emerging computer market while adhering, on the surface, to the terms of its 1956 DOJ consent decree.

"This has been part of the problem all along, and both the Internet community and, frankly, the FCC has fallen victim to it: People have simply made up new names for old services," remarks Comstock. "Nobody talks about mainframes anymore; they talk about cloud computing. If you look at what cloud computing is doing, it's just a reincarnation of the old mainframe system."

But lawmakers today clearly perceive cloud computing service as residing on an entirely different infrastructure, both structurally and conceptually. Is this simply because the names were changed?

"If people don't realize they're talking about essentially the same thing," he responds, "then obviously it becomes easier for them to be given the impression that they need new legislation. And that obviously works to certain people's advantage. But for [FCC Chairman] Wheeler, who is certainly a student of history himself, it's not hard for him to go back and find these documents. I pull them up on the Web all the time. They'll show you, yes, Congress was thinking about, talking about this, and the public was talking about this."

"What I find comical--and frankly, a tragedy," Comstock continues, "is that the 'expert agency' is an active participant in this misinformation to the public. Take the current net neutrality debate: They talk about a 'broadband Internet service provider.' That's not found in the statute anywhere. They don't use the terms that are already there."

But transmission speeds on the order of 45 Mbps--still faster than many broadband customers experience today--were under open discussion as far back as the 1960s.

Related Articles:
Another side of net neutrality: The case in favor of Title II [Part 1]
Congress punted on net neutrality, and FCC's O'Rielly missed it
FCC tries 'commercially reasonable' net neutrality compromise

Read more about: Computer II, Net Neutrality
back to top

Also Noted

This week's sponsor is PGi.

Webinar: Equipping an Increasingly Mobile Workforce
Tuesday, June 24th, 2pm ET / 11am PT
A recent survey revealed that enterprises in the United Kingdom are adopting "Choose Your Own Device" strategies twice as often as "Bring Your Own Device" strategies. This webinar will take a closer look at the business case for CYOD and considerations in implementing such a policy. Register Today!

SPOTLIGHT ON... A workable, device-agnostic comeback plan for BlackBerry

Last year, I wrote and published an extraordinarily popular piece entitled "Reassembling BlackBerry Without Making a Single Smartphone." In summary, my idea was this: If BlackBerry's core customers are enterprises, and what enterprises desire more than cool devices are secure platforms, then BlackBerry has the know-how and the patent portfolio to effectuate a profitable comeback. Now, in an interview with my friend and colleague Wayne Rash, BlackBerry President John Sims is unveiling a comeback plan that's based on four pillars. The three pillars other than continuing to make BlackBerry phones may ring a little familiar.
Read more: BlackBerry Plans Comeback Based on Security, Enterprise Services [by Wayne Rash, eWeek]

> Equipping an Increasingly Mobile Workforce - Tuesday, June 24th, 2pm ET / 11am PT
A recent survey revealed that enterprises in the United Kingdom are adopting "Choose Your Own Device" strategies twice as often as "Bring Your Own Device" strategies. This webinar will take a closer look at the business case for CYOD and considerations in implementing such a policy. Register Today!

> Developing for the Internet of Things: Challenges and Opportunities - Wednesday, June 18th, 2pm ET / 11am PT
Cisco estimates that 50 billion devices and objects will be connected to the Internet by 2020. Will there be a role for developers in this area? And if so, how can developers position themselves in the months ahead for this nascent but potentially explosive opportunity? Register Today!

> Enterprise Connect Lync Tour - May 7 - June 24 - Various Locations

> Keeping virtualized environments safe - June 2-5 - Nice, France - Sponsored by: TM Forum Live!
TM Forum Live! addresses the issues and opportunities surrounding virtualization for service providers and various types of enterprises, with presentations from Deutsche Telekom, AT&T, Telefonica and more. Save up to $400 on a gold pass when you register with voucher code PW3DA2!

> The TIA Network of the Future Conference - June 3-5 - Dallas, TX - Sponsored by: Telecommunications Industry Association
The Conference, which highlights the intersection of markets, technology and policy perspectives, will focus on transformation of the ICT industry as globalization, technological innovations and regulatory environments present challenges and opportunities. Topics include 5G, SDN, Big Data, NFV, Cybersecurity and much more. Click Here Now.

> Whitepaper: Avoiding the top three challenges of custom-coded SharePoint applications
In this white paper, learn about the challenges of custom-coded SharePoint applications. Then, see how you can overcome them to create the SharePoint sites you want. Download Today!

> Whitepaper: IT Made Easy with ManageEngine ServiceDesk Plus
ManageEngine ServiceDesk Plus is ITIL-ready help desk software with integrated asset and project management. True to its tagline, "IT Made Easy," ServiceDesk Plus wins hands down when it comes to ease of use, out-of-the-box settings and integration. Visit http://www.servicedeskplus.com/ to check out the list of features that come at just $995 and to download a 30-Day Free Trial!

> Whitepaper: How to Add Attachment Viewing to Salesforce
Read this Accusoft whitepaper to learn about the case for an attachment viewer for Salesforce. Download now.

> Whitepaper: Finding ROI in Document Collaboration
Read this Accusoft whitepaper to learn about the factors that make document collaboration more difficult than it should be, and about how to create a collaboration strategy that makes sense for your organization. Download Now!

> Whitepaper: Delight & Engage Customers with Mobile APIs
Read this success story and learn how a robust API and secure API management helped Keep's iOS app become one of the most popular apps in the Lifestyle category in the iTunes App Store.

> Whitepaper: Overcoming 5 Key Business Challenges for SMBs
In today's highly competitive marketplace, small to medium-sized businesses face challenges that are similar to, and in some cases more daunting than, those of larger enterprises.
Moreover, smaller organizations typically have fewer resources at their disposal, making these challenges even more difficult to address. Download this whitepaper to learn five ways to overcome those challenges.

> Whitepaper: 802.11ac in the Enterprise: Technologies and Strategies
Download the White Paper "802.11ac in the Enterprise: Technologies and Strategies" to learn from industry expert Craig Mathias about the technologies behind 802.11ac and deployment misconceptions, and to review steps that every organization should take in getting ready for 802.11ac. Download today!

> Whitepaper: Defense Against the Dark Arts: Finding and Stopping Advanced Threats
Today's most damaging targeted attacks don't occur by happenstance. They are carefully planned and executed by a new breed of professional adversaries. Read this white paper, Defense Against the Dark Arts: Finding and Stopping Advanced Threats, to gain a practical understanding of today's advanced threat landscape and strategies for detecting and stopping advanced threats. Download today!

> Whitepaper: Longline Phishing: A New Class of Advanced Phishing Attacks
The last few years have seen a dramatic increase in the use of email as a vehicle for cyberattacks on organizations and large corporations. Recently, Proofpoint researchers identified a new class of sophisticated, effective, large-scale phishing attacks dubbed "longline" phishing attacks. Download this whitepaper to learn about the unique characteristics of these attacks, how they are carried out, and the alarming effectiveness they have. Download today!

> eBook: eBrief | Best Practices in Mobile Application Management and Delivery
Your organization knows that mobile productivity is important, and it may have already started down the road toward Mobile Device Management (MDM) and Mobile Application Management (MAM). But have you developed a holistic view of application management and delivery--and its impact on the business? Download this free eBrief to learn about best practices for your mobile deployment.

> Whitepaper: APIs Drive Opportunity Explosion
Argos took bold, transformative measures to respond to market disruption from competitors selling online, as well as to the move by grocers into non-food product ranges. Learn how APIs paired with a secure API management solution can enable a digital transformation by delivering content and purchasing capabilities to customers anywhere at any time. Download Today!

> Whitepaper: Supporting VDIs and Thin Clients
Companies have already begun deploying VDIs and thin clients (like Google's Chromebook) on a massive scale. These low-cost, easily deployed workstations present a significant cost savings for companies, but require unique tools to support them. This whitepaper, written by Proxy Networks, outlines the best way to do that. Download now.

> Whitepaper: Four Ways to Improve IT Efficiency
The role of the help desk within businesses has expanded considerably over the last decade, becoming an integral piece of the overall corporate strategy. In this whitepaper, Proxy Networks outlines the best way to align your IT department with that strategy in order to improve overall departmental efficiency. Download now.