Shadow-Soft, a leading provider of open source technology solutions, has released new features on Buyopensource.com to ease the management and purchase of software subscriptions and licenses. The latest addition to Buyopensource.com is a Cart to Quote module, which allows the Shadow-Soft sales team to build, configure, and send an online quote to a user’s email. A user account is created for the customer, who can then access the quote directly after logging in. Once the quote is reviewed, the customer is a click away from purchasing.
Shadow-Soft’s e-commerce manager states, “We have found that purchasing online is convenient for many customers but selecting which software items they need can be cumbersome. Utilizing this new feature, our customers have a Shadow-Soft specialist configuring a cart for them and they can choose to purchase when they are ready. This feature has already assisted many customers that have smaller software renewals and want to purchase online using a credit card.”
Shadow-Soft continues to find ways to create simple processes for its customers and to stay ahead of the curve. Contact Shadow-Soft today to request a quote issued from Buyopensource.com.
By Andrew Brust
It ain’t just Linux
Red Hat Enterprise Linux (RHEL) is arguably Raleigh, North Carolina-based Red Hat’s flagship product, but the operating system arena is not by any means its only focus. Red Hat also has big irons in the storage, cloud and developer fires, and its Big Data strategy announcement addressed all three of these.
Big Data is now a relevant factor in the entire enterprise software stack.
One could argue that the crux of Red Hat’s Big Data manifesto focuses on the hybrid cloud. Red Hat’s Big Data narrative entails customers working on Big Data pilots/proofs-of-concept in the public cloud today, with the need to put those projects into production in the on-premises, private cloud in the near future.
I’m not sure this narrative is quite as universal as Red Hat would have us believe, but the motivation Red Hat derives from it is nonetheless laudable: to make certain that Big Data projects can move seamlessly from the public cloud environment to the private cloud, or vice versa, without “re-tooling.”
What defines the strategy?
In order for that roundtrip to be possible, and in an environment built on Red Hat Enterprise Linux, Red Hat Storage, JBoss Middleware and the OpenShift cloud platform (as well as the OpenStack cloud platform overall), Red Hat announced the following initiatives:
Who the strategy involves
Red Hat also announced it will be forging hardware and software partnerships with an eye toward developing a full ecosystem around its Big Data approach. One deliverable from these partnerships will be reference architectures that the company said could be used as “cookbooks” by enterprises to build out Big Data infrastructure with greater assurance of success.
What it Means
Red Hat rightly pointed out that the majority of Big Data projects are built on open source software (including Linux, Hadoop, and various NoSQL databases) and so it’s fitting that such an important company in the open source world as Red Hat would announce its Big Data strategy.
What’s especially significant here is that Red Hat is also an enterprise software company, and it articulated a strategy aimed at making Big Data part of the mainstream enterprise stable of tools and technologies. It’s a big step in the maturation process for Big Data technology, and that maturation will figure heavily in the tech world in 2013.
On January 21, 2013, OC Systems released version 3.4 of RootCause Transaction Instrumentation (RTI). Significant enhancements have been added to this product, which helps monitor web applications through JBoss Operations Network:
Interested in evaluating RTI? Register here for a free 31-day evaluation.
By Christopher Mims — December 5, 2012 (QZ.com)
Oracle is a $156 billion corporate IT company with a big problem on its hands: in a recent survey of senior information-technology executives in charge of IT budgets greater than $50 million, 85% are trying to figure out how to get out of expensive license agreements with Oracle, reports The Register.
Known as “enterprise license agreements” (ELAs), the contracts specify that if a customer changes the software or support provider for one of the many Oracle products large businesses typically pay for, the customer still owes Oracle the same amount of money. That’s because ELAs are company-wide licenses that cover the totality of Oracle products a customer uses. As a result, many large corporations have had little incentive to explore alternatives to Oracle, until now.
With cloud-based service providers like Salesforce.com offering cheaper alternatives, Oracle could be in trouble. Of the 85 leading IT executives surveyed by investment bank Cowen and Company, at least a quarter aren’t just thinking about getting out of Oracle’s expensive all-you-can-eat contracts, they’re actually looking at switching to one of the company’s competitors.
So far, Oracle’s revenue has yet to reflect this sentiment: third-quarter results showed stronger-than-expected sales of licenses for the company’s business databases. Oracle also has a powerful lock on existing customers, who have spent years, and in some cases decades, building mission-critical services around Oracle’s products. Getting that data out, and reconstituting it with a competitor, is a considerable headache that could keep Oracle living off a steady stream of corporate licenses for years to come, even if future sales reflect the intentions of the IT executives in the survey.
By: Douglas DeLoach
In August, Shadow-Soft, LLC, an Atlanta-based open source software (OSS) solutions and consulting firm, announced the company had been approved by the General Services Administration to begin contracting with the U.S. government.
Shadow-Soft executives anticipated the approval would open up a path to significant growth since the government has been expressly turning toward open source software as a means to saving taxpayer dollars while keeping critical systems technology current.
“We are on track to do more than $10 million in total revenue in 2012,” said Shadow-Soft CEO James Chinn. “We think the GSA schedule will contribute about 30 to 40 percent additional revenue in 2013.”
Shadow-Soft provides commercial, public sector, and now government enterprises with products, services, and integration expertise designed to support the deployment of applications in an open source environment, the vast majority on a Linux platform.
Like most open source proponents, Shadow-Soft touts its technology as offering more bang for the buck, thanks to generally lower cost of development and deployment, and its highly scalable architecture.
“In today’s tough economic conditions, our customers are having to find a way to do more with less, but we see open source software doing much more than just reducing expenses,” said Shadow-Soft co-founder and Chief Operating Officer Erik Wallin. “OSS has proven to be more scalable and easily interfaced than traditional, proprietary, high-priced solutions, which will be increasingly important as more organizations transition to the cloud.”
In the months since the GSA approval, Shadow-Soft has opened a Washington D.C. office, currently staffed by three employees, to manage GSA-specific customers and other government business. Additionally, in September, Shadow-Soft introduced Puppet, the first of many planned products for federal, state, and local government entities that choose to follow the open source path.
Developed by Portland-based Puppet Labs, Puppet is a program that helps systems administrators manage IT infrastructure and data center tasks from provisioning and configuring to patch management and compliance.
Puppet can be scaled to almost any environment from tens of servers to thousands of servers located either physically on-site or virtually in the cloud. Current Puppet customers include Citrix Systems, Inc., Shopzilla Inc., Match.com, Oracle/Sun, Twitter, Yelp, eBay, JP Morgan Chase, Bank of America, Google, Disney, and Viacom Inc.
“In effect, Puppet Labs is outsourcing their sales staff to us because we developed a special area of expertise,” Chinn said. “On the other side of the deal, the government gets to lock in a price and negotiate a delivery schedule based on its evolving requirements, which makes wide-scale implementation much easier for everyone.”
The GSA schedule is a five-year contract (with the potential for three additional five-year extensions) that streamlines the procurement process and gives companies access to various contracts and agencies as an approved supplier. One of the goals of the GSA schedule is to help level the playing field, allowing smaller companies and minority- or women-owned businesses to compete with bigger companies.
“The government can be a terrific customer with lots of potential for repeat business and with no credit risk, since the government usually pays on time,” said Robert M. Gemmell, director of the Herman J. Russell Sr. International Center for Entrepreneurship in the J. Mack Robinson College of Business at Georgia State University.
“According to government figures, among the roughly 20,000 companies listed on various GSA schedules about 80 percent are small businesses,” Gemmell said.
The process for getting on the GSA schedule is straightforward enough, albeit lengthy, and not devoid of effort. Companies must first identify the appropriate schedule based on their product line and business strategy.
The paperwork is extensive, terminology may be unfamiliar and assembling the requisite documentation may present a daunting task. Gemmell cautioned GSA schedule aspirants not to underestimate the learning curve or the challenge of doing business with the government.
“Some entrepreneurs hire consultants and essentially contract all or part of their GSA solicitation response,” Gemmell said. “I’ve seen fees ranging from $10,000 to $25,000 for such services, so entrepreneurs should carefully review the track records and credentials before hiring anyone.”
Early on in the company’s history, Shadow-Soft partnered with Red Hat Inc., the Raleigh, N.C.-based developer of the Red Hat Enterprise Linux operating system.
Using an open source/open code business model, Red Hat supports open source software projects and provides operating system platforms, middleware, applications, management products, and support, training, and consulting services.
“Shadow-Soft is truly a natural extension of our sales force,” said Matt Simontacchi, managing director of Mid-Atlantic and Southeast regions at Red Hat. “We both have seen tremendous growth in this space as more companies are looking to harness the power of community-driven innovation and break free from vendor lock-in.”
In the summer of 2008, Chinn and Wallin, who at the time were co-workers at an IT company, saw a gap in the technology market, so they quit their jobs to start a company that would seize the opportunity. In just four years, Shadow-Soft has moved into a larger facility and now employs 18.
“We’re looking forward to additional capabilities on our services team, and new product offerings in our line,” Chinn said. “We’re excited about the future.”
ATLANTA, Georgia and PORTLAND, Oregon – September 25, 2012 — Puppet Labs, the leading provider of IT automation software for system administrators, and Shadow-Soft, a leading Public Sector Open Source solutions provider and General Services Administration (GSA) Schedule Contract holder, today announced a partnership wherein Shadow-Soft will be the strategic GSA schedule distributor for U.S. Government, State and Local organizations that choose Puppet Labs’ leading IT automation solutions.
Puppet is a revolutionary configuration management tool that provides systems management and datacenter automation for the enterprise and cloud. Puppet enables system administrators to collect operational data and set configurations across heterogeneous environments of Windows, Linux, and Unix systems using a simple declarative syntax. Thousands of organizations use Puppet to manage their infrastructure, including Zynga, Citrix, Shopzilla, Match.com, Oracle/Sun, to name a few. Under this agreement, Shadow-Soft and Puppet Labs will jointly market and sell the Puppet Labs solutions set to Federal, State and Local Government agencies.
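Puppet’s declarative syntax, mentioned above, can be illustrated with a short manifest. The resources below (an NTP package, its config file, and its service) are a generic illustrative sketch, not drawn from Shadow-Soft or Puppet Labs materials:

```puppet
# Illustrative manifest: declare the desired end state of an NTP service.
# Puppet figures out how to converge each managed node to that state.

package { 'ntp':
  ensure => installed,
}

file { '/etc/ntp.conf':
  ensure  => file,
  owner   => 'root',
  mode    => '0644',
  require => Package['ntp'],          # install the package before managing its config
}

service { 'ntpd':
  ensure    => running,
  enable    => true,
  subscribe => File['/etc/ntp.conf'], # restart the service if the config changes
}
```

Rather than scripting installation steps for each platform, the administrator declares the end state once, and Puppet converges every managed Windows, Linux, or Unix node toward it.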
“This partnership between Shadow-Soft and Puppet Labs combines the experience and expertise of Shadow-Soft’s end-to-end enterprise open source solution and consulting practices with Puppet Labs’ ubiquitous IT automation technology,” said Scott Campbell, Vice President of Sales for Puppet Labs. “The partnership will help government agencies streamline the acquisition and use of Puppet Enterprise, enabling them to manage next-generation IT infrastructure on-premise or in the cloud.”
Puppet Labs’ solutions are available on Shadow-Soft GSA Schedule # GS-35-F-0524Y. Please contact Shadow-Soft at firstname.lastname@example.org for additional information.
About Puppet Labs – Puppet Labs, Inc.(www.puppetlabs.com) was founded in 2005 and shipped the first release of the open source Puppet Project later the same year. Puppet’s popularity has since grown to where it now is responsible for managing millions of nodes across thousands of companies and organizations, both on-premise and in the cloud. Now numbering over one hundred employees and based in Portland, Oregon, Puppet Labs is backed by investors Kleiner Perkins Caufield & Byers, Google Ventures, VMware, Cisco, True Ventures, Radar Partners, and Emerson Street Partners.
About Shadow-Soft – Shadow-Soft is a leading provider of Open Source Software, Cloud Computing, and other market disruptive solutions. With offices in Atlanta and Washington, D.C., Shadow-Soft strives to make freedom, flexibility, and Open Source the building blocks of its enterprise technology solutions. As the Red Hat Catalyst Partner of the Year (2011), the Shadow-Soft team and partnerships are comprised of the Open Source industry elite, focusing on the value and performance Open Source delivers to their commercial, educational and government customers. Visit www.shadow-soft.com to learn how Shadow-Soft can help you reduce technology costs and drive innovation by migrating from proprietary to Open Source technologies.
Data is growing exponentially. Web content has created an insatiable demand to increase storage capacity and manage data more efficiently. IDC’s report in 2011 predicts that the amount of data in the world will surpass 1.8 zettabytes. What is a zettabyte? One trillion gigabytes. The projected forecast for data consumption by 2015 is nearly 8 zettabytes. This astounding amount of data will cause the same problems enterprises have been dealing with for years. Where is the data going to be stored and how is it going to be managed?
Most of this data is unstructured or semi-structured data which would be a great fit for Red Hat Storage (RHS). This software based appliance can run on any traditional x86 server platforms. The RHS platform provides:
RHS deploys in minutes for scalable, high-performance storage in your datacenter or public cloud. RHS brings industry-leading value because it provides a highly available and resilient place for unstructured data to live. RHS can scale with the data explosion without suffering in performance. As with all Red Hat products, RHS is open and has no proprietary data formats. These three qualities - VALUE, SCALE, and OPEN - are what set Red Hat Storage Server apart from the current SAN and NAS scale-up storage solutions you may be using today.
Written by Derrick Harris
Disney is a massive company, but when it comes to its big data platform, the entertainment conglomerate looks a lot like a startup. Kind of, that is. By the sheer power of its will (and ingenuity), a small team has been able to craft a large custom platform out of Hadoop, NoSQL databases and other open-source technologies. But for better or for worse, doing big data at such a large company means playing by a different set of rules.
When it came to putting a big data platform in place, Arun Jacob, director of data solutions in the Disney Technology Solutions & Services group, told a room at the IE Group Big Data Innovation conference in Boston on Thursday that Disney chose to build something from scratch rather than buy software from a large vendor. Cost certainly played a role, but really it was flexibility that drove the decision.
In order to provide the most value to the company, Disney’s big data platform has to be everything to everyone, which it turns out is a tall order. Initially, Jacob said, “We treated ourself like a small consulting organization and we had something to sell.” When a division wanted it to use the platform for a particular function, Jacob would say yes and then get busy actually figuring out how to build it.
Architecturally, it’s all about being able to recompose the path data takes through the platform and the components that are used for each particular purpose, or being able to easily replace pieces altogether if something better comes along. The Disney platform has a foundation of Hadoop, Cassandra and MongoDB complemented by a suite of other tools for particular use cases. The operations team uses the platform to view, analyze and index error messages, while another division runs a recommendation engine on top of it. Application developers get the high-throughput, low-latency data access they need, while the analytics team has the higher-latency data access it requires.
However, although Jacob wanted to keep costs down with open source software, he did have a luxury that most startups don’t — a budget for outsourcing and the occasional product. When he needed support with a Hadoop cluster, he could call Cloudera. When an implementation of Solandra (an open source search engine built atop Solr and Cassandra) tipped over under the weight of Disney’s scale, he bought the enterprise edition of DataStax’s Cassandra-based product (Solandra’s creator had since taken a job with DataStax and was expanding upon Solandra’s capabilities in DataStax Enterprise).
The Solandra incident actually underscores the tradeoffs that come when you use free open-source software and don’t reach for the checkbook at any sign of trouble. “You pay for [open-source projects] late at night, you pay for them by learning to run them, you pay for them by reading people’s source code who even if you could read it, it still doesn’t make any sense,” Jacob said. But those things can be overcome if you’re willing to put in the time.
And at a company the size of Disney, those problems — and a whole lot more — have to be overcome. For example, Jacob explained, you can fudge your way around things like fault tolerance, high availability and security when you’re standing up a deployment, but you do have to figure out a way to achieve those things eventually.
You also have to make systems built on open-source software consumable by everyone who needs to use them. That means it’s not enough to just build a scalable and stable system; the system also has to be easy enough for thousands of internal developers of all types and all skill levels to use. In a six-person startup, Jacob said, it’s easy enough for everyone to just learn Hadoop in a month and then start using it, but that’s not the case in a large enterprise.
So his team made it easy.
In order to “remove the excuses” for business users not loading their data into the system, they just need to point the custom-built user interface at their files. (Disney’s platform is growing at 5TB a day, and there are still many other types of data it needs to house, Jacob said.) Because they’ve built wrappers around the technology, Jacob’s team doesn’t talk about Hadoop and MongoDB to internal users, only about analytics and queries. It built client frameworks in a bunch of programming languages so developers can interact with the platform without writing RESTful API calls.
In some cases, the team decided to hide the platform’s complexity from users; not to facilitate its use, but to keep loose-cannon developers from doing something crazy that could take down the whole cluster. It could show them all the controls and knobs in a NoSQL database, but “they tend to shoot each other,” Jacob said. “First they shoot themselves, then they shoot each other.”
Still, after all the work he put into building Disney’s big data platform, it’s not exactly a process Jacob is hoping to repeat as the platform evolves. The tools for managing big data are getting better, he said, so he still does a build-versus-buy analysis when it’s time to make a change. Building custom tools is fine when you don’t have a choice, but it’s not always wise when buying something could save untold man-hours and headaches.
Written by Steven Aftergood
A new U.S. Army publication provides an introduction to open source intelligence, as understood and practiced by the Army.
“Open-source intelligence is the intelligence discipline that pertains to intelligence produced from publicly available information that is collected, exploited, and disseminated in a timely manner to an appropriate audience for the purpose of addressing a specific intelligence and information requirement,” the document says.
“The world is being reinvented by open sources. Publicly available information can be used by a variety of individuals to [achieve] a broad spectrum of objectives. The significance and relevance of open-source intelligence (OSINT) serve as an economy of force, provide an additional leverage capability, and cue technical or classified assets to refine and validate both information and intelligence.”
See “Open-Source Intelligence,” Army Techniques Publication (ATP) 2-22.9, July 2012.
The new manual is evidently intended for soldiers in the field rather than professional analysts, and it takes nothing for granted. At some points, the guidance that it offers is remedial rather than state of the art.
For example, “if looking for information about Russian and Chinese tank sales to Iraq, do not use ‘tank’ as the only keyword in the search. Instead, use additional defining words such as ‘Russian Chinese tank sales Iraq’.”
But the manual reflects the ongoing maturation of open source intelligence (OSINT), and it contains several observations of interest.
“The reliance on classified databases has often left Soldiers uninformed and ill-prepared to capitalize on the huge reservoir of unclassified information from publicly available information and open sources,” the manual states.
Classification can also be a problem in open source intelligence, however, and “concern for OPSEC [operations security] can undermine the ability to disseminate inherently unclassified information.”
“Examples of unclassified information being over-classified [include] reported information found in a foreign newspaper [and a] message from a foreign official attending an international conference.”
Therefore, pursuant to Army regulations, “Army personnel will not apply classification or other security markings to an article or portion of an article that has appeared in a newspaper, magazine, or other public medium,” although the resulting OSINT analysis might be deemed “controlled unclassified information.”
Somewhat relatedly, the Department of Defense this week published a new Instruction on DoD Internet Services and Internet-Based Capabilities, DODI 8550.01, September 11, 2012.