Draymond Green pushes back on criticism surrounding DeMarcus Cousins’ defense

OAKLAND – For a man who usually loves talking trash, Draymond Green became agitated over all the punchlines and commentary surrounding DeMarcus Cousins’ recent defense.

In the Warriors’ 122-105 victory over the Denver Nuggets on Friday at Oracle Arena, Cousins finished with 13 points on 5-of-11 shooting along with six blocks and three steals in 28 minutes. So, Green pushed back with the kind of force he usually displays on the court.

“Everybody wants to talk [junk] about DeMarcus’ defense,” …

Getting people to help themselves

In January 2016, Musa Mathebula established the NPO Musa Projects. His organisation aims to create hope and empower those who cannot provide for themselves, so as to improve the quality of life in affected communities.

Musa Projects’ mission is to partner with both private and public sectors and find ways to improve the quality of life in impoverished communities. (Image: Musa Projects)

In just over a year since its establishment, Musa Mathebula’s NPO Musa Projects has collected and distributed sanitary pads to schools in rural communities; provided school shoes and toiletries to students; collected and handed out clothing to destitute families; assisted families with food parcels and other household items; and helped homeless people with clothing and open day gatherings.

Some of the work Musa Projects has been involved in includes donating cement and tiles to Mbabalana Primary School in Port St. Johns in the Eastern Cape; donating baby formula and clothing to a family in Soweto in Gauteng; offering food to the victims of flash floods in Alexandra in Gauteng; and handing out sanitary pads, shoes and toiletries to schools in Barseba Village in the North West.

Go to www.musaprojects.org.za to play your part in this generous movement.

Big Memory: An Interview with Terracotta CEO Amit Pandey

I did an interview this morning with Terracotta CEO Amit Pandey about the fascinating new dimensions of in-memory data and its use for search. How excited I was when I learned that the public relations person had been transcribing the interview. It felt like a Mad Men moment!

It turned out well enough to include in its entirety. Pandey covers the fast-evolving world of big memory. The term is catchy, isn’t it? Amit certainly gives a pitch here, but we should expect that in a format like this. The interview is insightful in that it shows how fast data can now be accessed and what it means for the new world of real-time intelligence.

Alex Williams: So tell me what you’re seeing in the market these days?

Amit Pandey: Our most interesting thing is — I had briefed you on BigMemory, right? It has surpassed our expectations, which were high, in terms of how useful it is and how many people want it. Everything starts from the name itself, which stirs customer imaginations – the choice of name was quite good because it resonates with people’s desire to use more memory. You’re getting servers that are much larger, with more memory. Most CIO offices feel an imperative to make data more accessible in real-time. Most people know that really the only way to do that is put it in-memory, where the applications can reach it very quickly. The name grabs people’s imaginations, they dig in and find it is what they thought it was, and they start doing proofs of concept, and we go from there. What we’ve seen is that with this product, with BigMemory, we have much more data on its performance now, and we’re able to achieve generally about 4-10x the density of other products – and even of our own older, non-BigMemory products – in terms of how much hardware and how many servers it takes to hold the amount of in-memory data that customers want.

I’ll give you an example. 
There’s a large bank that had its data distributed across 30 servers to achieve the amount of in-memory data they need. With BigMemory, they were able to bring it down to two servers. That was a huge win for them because the cost of administration suddenly dropped. The complexity of running 30 servers isn’t even comparable to the complexity of running two.

Williams: Okay, so that’s an example of that 4-10x density?

Pandey: Yeah. In this case it was a 15x increase. But theoretically it’s possible to do that use case on one server. We did tests with Cisco’s UCS server and we’ve been able to run 350 gigs on a single UCS server. That’s bigger than a lot of people’s entire database. In the past, I would say 6-8 gigs per application instance was about the most you could do.

Williams: You said that’s bigger than most people’s database?

Pandey: 350 gigs? Sure. There are databases that go into terabytes and petabytes, but I would say the average database is probably around that size or smaller. So because in one instance of an application we can get that kind of in-memory storage, it’s a huge win. We’re seeing that’s very exciting to people because it simplifies their architecture and makes it a lot more elegant. I think you remember the reason for the 6-8 gigabyte limit was that the Java memory manager is very poor at handling bigger sizes, and BigMemory bypasses that. So that’s just a quick recap.

In general, the whole concept of doing things in-memory and in real-time is big on everyone’s agenda – real-time analysis, etc. The natural step for us was: if we can put this much data in memory, people will want to do things with it. They’ll want to do searches and queries of that data to find out what their customers are doing, what their customers need, and so forth, so real-time analysis of that data is critical. 
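The architecture Pandey describes keeps data in the application tier so reads never cross the network to the database. A minimal cache-aside sketch makes the point – note this is an illustration of the general pattern, not the Ehcache/BigMemory API, and all function and key names here are hypothetical:

```python
# Minimal cache-aside sketch: serve reads from an in-process store and
# fall back to the (slow, remote) database only on a miss.
# Names are illustrative, not the Ehcache/BigMemory API.

cache = {}

def slow_database_lookup(key):
    """Stand-in for a query that crosses the network to a database."""
    return {"key": key, "value": f"row-for-{key}"}

def get(key):
    if key in cache:                  # in-memory hit: no network round trip
        return cache[key]
    row = slow_database_lookup(key)   # miss: pay the round-trip cost once
    cache[key] = row                  # keep the row in memory for next time
    return row

first = get("customer:42")   # goes to the "database"
second = get("customer:42")  # served from memory
print(first is second)  # True
```

The interview’s point is that once the application can hold hundreds of gigabytes in this layer, the miss path almost never runs, which is where the consolidation from 30 servers to two comes from.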
We’re releasing a native search capability for Ehcache which basically lets you search as much memory as you would need.

Here’s an example of the kinds of searches customers are doing: we have a SaaS customer that does logistics for fast-food restaurants. Normally, they run reports against a database – there are two issues with that. One is that they were not getting real-time data, because they would batch stuff, write it back to the database, and run their reports on a four-hour basis or at the end of the day. What customers really need is to find out at any given moment how many hamburger buns they have in inventory and how many have been used up, so they know where they are and can do real-time management, lower costs, and make sure they don’t run out.

To do that against the database was very slow because it meant going across the network to the disk. It also meant the database was very overloaded, and it meant spending tons of money expanding the database to get this done. Running against the database was taking them about 35-40 seconds for some of these reports. But BigMemory with Search dropped their times from 35-40 seconds down to less than half a second.

Williams: So what were they trying to understand?

Pandey: Search was for inventory items like hamburger buns or other food items, and they were trying to figure out in real-time how much has been used right now and how much is left. They needed real-time analysis of data, and doing real-time analysis on a database was both very slow and very expensive.

Williams: So instead of having to do that against a database, they can now use in-memory to do that a lot more efficiently?

Pandey: Exactly, because the data is right there. You don’t need to make a round trip to go across the network to the database. 
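The inventory query Pandey describes is an attribute search over records already held in memory. A short sketch of that idea – again hedged: this illustrates the pattern, not the Ehcache Search API, and the record fields and helper are invented for the example:

```python
# Illustrative in-memory attribute search: the records live in process
# memory, so a "report" is just a filter pass, not a database query.
# Field names and the search() helper are hypothetical.

inventory = [
    {"item": "hamburger_buns", "store": "store_12", "used": 340, "remaining": 160},
    {"item": "hamburger_buns", "store": "store_47", "used": 280, "remaining": 220},
    {"item": "pickles",        "store": "store_12", "used": 90,  "remaining": 410},
]

def search(records, **criteria):
    """Return records matching every key=value criterion -- a simple
    stand-in for attribute-based search over in-memory data."""
    return [r for r in records if all(r.get(k) == v for k, v in criteria.items())]

buns = search(inventory, item="hamburger_buns")
total_remaining = sum(r["remaining"] for r in buns)
print(total_remaining)  # 380
```

Because the scan never leaves the process, this is the kind of query that runs in fractions of a second where the batched database report took tens of seconds.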
The cost is a lot lower, because otherwise they’d have to buy a lot more database licenses to achieve the speeds they need.

Williams: It’s interesting how that can affect the supply chain, too.

Pandey: Sure, it can. Their customers won’t tolerate those kinds of search speeds. So the logistics company was looking for another solution. Without BigMemory, this company would probably have had to spend a lot of money upgrading to a really expensive solution like Oracle Exadata or something like that.

Williams: Where is this all going?

Pandey: Our customers use it for all kinds of things: we have travel reservation systems running on Ehcache, websites, online gaming systems, back-end medical patient records. Anywhere you need to do a quick search and query of what your customers are doing in real-time, you can do that. You could do a search and say, how many people are currently logged in and playing my game who are 25 years old and live in Oklahoma, because I want to do a promotion in Oklahoma for those people right now. You can do that with a database, but it would be very slow and very expensive. With in-memory data you can do that really fast, and target those people quickly.

One thing I do want to make clear, Alex: we’re not saying we are replacing analytics in the database. We’re not providing all the heavyweight reporting capabilities that business intelligence tools offer today for databases, and we’re not doing all the analytics. But what we are providing is a very simple, powerful, lightweight search where you can do real-time analysis of customer behavior and things like that. Over time we’ll make it a richer reporting set.

We’re working with BI and other vendors to provide hooks so that they can run their stuff against ours. It will become richer over time. 
So right now, we provide a simple, lightweight thing that’s extremely useful for real-time analysis, but you couldn’t really say it’s a business analytics tool yet, because those have been developed over the years and the term “analytics” is loaded. We’re very careful to use the term “real-time analysis”. Over time, in the next 3 to 5 years, I see this getting richer. You’re already seeing all these companies (SAP, etc.) talking about merging analytical and transactional together in one architecture. What we are doing is essentially that; we’re taking baby steps toward it.

Williams: With tablets available, you can see this data visually; that has an added impact.

Pandey: Yeah. The great thing is if you put search capability in the application, it’s sort of independent of the platform that uses it – it could be a phone, tablet, etc. Your platform can be leveraged by any of these devices. Obviously, mobile devices would be a big part of that.

Williams: Thanks for your time!

(Photo: Amit Pandey, CEO, Terracotta)

Alex Williams

Yes, We Can

After Henry Ford perfected the automobile assembly line, U.S. industry experienced several decades of explosive growth. Although industrial expansion was interrupted for a decade during the 1930s, it roared back during the ’40s, ’50s, and ’60s. Historians have proposed several explanations for these decades of growing productivity, including the country’s high rate of immigration and access to cheap energy and natural resources.

Another possible factor is especially intriguing: our large farming population produced several generations of skilled tinkerers who excelled at mechanical innovation. According to this theory, rural American teenagers in the ’20s, ’30s, and ’40s spent hours in the barn tinkering with tractors, hay balers, and wind generators. These farm mechanics could make and fix just about anything using a variety of parts from broken-down equipment and the local dump. Many of these self-taught tinkerers later became engineers and industrial innovators.

“Our industry will be crippled”

The decades of increasing U.S. productivity might be called the “Can Do” era. Sometime around 1970, however, the tide shifted. At the risk of oversimplification, it’s tempting to say that U.S. industry made a transition to a new era – the “It’s Impossible” era.

Among the leaders of the “it’s impossible” camp are U.S. automakers. For decades, automakers sent an army of lobbyists to testify before Congress that auto safety, efficiency, and clean-air mandates would cripple the industry and make cars unaffordable. 
Among the innovations that these lobbyists resisted were mandatory airbags, catalytic converters, and improved gas-mileage standards.

Once Congress finally got the courage to increase safety, mileage, and clean-air standards, guess what happened? The cost of the innovations quickly dropped.

Appliance manufacturers imitated automakers by hiring lobbyists who claimed that proposed refrigerator-efficiency standards were burdensome and unaffordable. Once new efficiency standards were enacted, however, manufacturers…