Showing posts with label New Technology. Show all posts

Monday, February 12, 2018

Part Three: The Analytics Strategy and Roadmap - A Use Case Driven Plan to Incrementally Build an Analytics Capability Powered by Big Data

In the previous blogs of this three-part series, I addressed a clear analytical divide that has grown in the industry, where relatively mature BI shops are at a definite advantage over most organizations that have yet to fully realize an analytics capability powered by big data. The first blog in this series identified the critical capabilities needed for analytical success with big data, and the many impediments, both technical and organizational, that are holding companies back.
Building on this idea in the second blog, I outlined why the attempt to take a ‘big bang’ approach to big data, by first putting all of the enterprise’s data into a data lake, is not likely to succeed: it returns little ROI in the short run and has major investment, governance, and skills requirements. Instead, I proposed establishing a multiplatform data warehouse environment (DWE) with an architecture pattern that's designed to accommodate immediate use cases with specific goals and measurable ROI, so the program can fund itself along the way.
In this approach, the requisite analytics capabilities will be gained through a managed transformation: an incremental, phased build-up in which the big data journey is mapped out in clear, achievable, but increasingly challenging milestones that progressively bring in the different natures and types of big data. The strategic roadmap for big data will be formulated based on these early successes, with greater participation and sponsorship from the business as it starts to see value from this technology. That will help refine the tactical aspects of the strategy's execution.
In this blog, I present a four-phased roadmap to get there, each phase building the pre-conditions to succeed with the next. The phases will of course overlap when work on a previous phase continues with other use cases. I will cite telecom use cases in the customer experience domain only to illustrate the comparative progression and analytical maturity in each phase.

The use-case driven approach starts with more technical, IT-driven challenges and matures through departmental operational decisions to, finally, strategic decision support. The maturity progression looks like this: 1) Data Warehouse Off-Loading, 2) Operational BI, 3) Operational Analytics, and 4) Strategic Analytics. The use cases will have to be evaluated in two major dimensions: the implementation capability and capacity needed, and the degree of organizational change required, based on their impact on current business processes.
In the early part of the transformation, the big data initiatives will be more technical in nature and localized at the department level. They will require the least additional skill and have a positive, if minimal, impact on business processes. In the later stages, the evolving use cases will have wider business impact and will demand more capacity and deeper technical and organizational capabilities in big data. The final stage involves the adoption of analytics for organizational strategic planning.
As the later stages involve use cases that are more operational and strategic in nature, which can impact processes across many departments, they will demand a more robust organizational change management program to manage the change across different participating groups and additional governance requirements. Large companies will have multiple big data teams, and as the organization builds more advanced big data capabilities, teams will need to come together for interdepartmental use cases.

Phase 1: Offload data and workloads from legacy systems and the enterprise data warehouse

Like most other IT systems, as data warehouses age, their design and enabling technologies can become un-scalable in terms of their economics and performance. Adopting multiplatform data warehouse environments would solve many data storage and performance issues, which is why it is one of the strongest trends in data warehousing today. In this phase, high volume detail transaction data storage and processing will be off-loaded to a Hadoop platform, reducing the storage and computing resource requirements of the relational data warehouse platform. From a business viewpoint, this is a non-disruptive task. It preserves existing investments in data warehousing, and (when done well) it extends the life of an expensive and useful system.
The off-loaded detail data, which is hardly exploitable in a traditional RDBMS, will also become amenable to analytic exploitation because of the linearly scalable architecture of Hadoop, increasing its value to the business: with the right questions, valuable insights can be drawn from it. Organizations can also explore the possibility of monetizing these detail data. For example, location-based and movement-over-time data can be derived from Call Data Records in the telecom industry. Inducting mainframe data, offloading its processing to Hadoop, and actively archiving historical data are other examples of IT use cases for this phase.
This phase will require a relatively small investment in the big data cluster: between 6 and 10 nodes, depending on the data volume to be off-loaded. In terms of investment and ROI, this phase will typically pay for itself through reduced infrastructure costs, improved performance of ETL processes and reports, and the additional value unlocked in the detail data.
The foundation of big data capabilities for the organization will be laid in this phase: IT will get a foothold in Hadoop skills on familiar existing structured data. Data governance policies will be applied to the data off-loaded to Hadoop, and in doing that, the finer aspects of practicing the data governance principles and policies will be sorted out, again, on familiar territory of the data. An Agile development methodology with DevOps should be inducted in this phase, delivering value as early as possible while streamlining the support functions to the big data program.
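As a concrete (and deliberately simplified) sketch of the off-loading pattern, the snippet below moves aged detail rows out of a relational warehouse into cheap, date-partitioned flat files, the way a Hadoop archive would receive them. SQLite stands in for the warehouse here, and the table and column names are invented for illustration; a real implementation would use tools like Sqoop or Spark against the actual warehouse.

```python
import csv
from pathlib import Path

def offload_old_transactions(conn, archive_root, cutoff_date):
    """Copy rows older than cutoff_date into date-partitioned CSV files,
    then delete them from the warehouse table."""
    rows = conn.execute(
        "SELECT txn_id, txn_date, amount FROM transactions WHERE txn_date < ?",
        (cutoff_date,),
    ).fetchall()
    by_date = {}
    for row in rows:
        by_date.setdefault(row[1], []).append(row)
    for txn_date, partition in by_date.items():
        # One directory per day, mirroring a Hive-style partition layout.
        part_dir = Path(archive_root) / f"txn_date={txn_date}"
        part_dir.mkdir(parents=True, exist_ok=True)
        with open(part_dir / "part-0000.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["txn_id", "txn_date", "amount"])
            writer.writerows(partition)
    conn.execute("DELETE FROM transactions WHERE txn_date < ?", (cutoff_date,))
    conn.commit()
    return len(rows)
```

The partition-directory layout is what makes the archived detail data cheap to scan selectively later, which is exactly the property that makes it exploitable for analytics in the next phases.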

Phase 2: Operational BI (event processing)

While the first phase is based on batch processing, the next will be based on near-real-time and subsequently real-time processing—starting with processing structured data, progressing to semi-structured and unstructured data.
It can start with rule-based event processing use cases on structured data (like fraud detection for telecom), which can happen in near real time, and then move on to processing more voluminous structured data in a more real-time basis (like identifying potential Mobile Switching Center failures and re-routing more profitable customers to a different Mobile Switching Center in real-time to avoid service degradation).
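The rule-based event processing mentioned above can be illustrated in miniature. The sketch below flags a hypothetical fraud pattern (too many international calls from one SIM within a sliding window); the threshold, window, and field names are invented placeholders, not a real detection policy.

```python
from collections import defaultdict, deque

WINDOW_SECONDS = 60   # sliding window length (assumed)
MAX_INTL_CALLS = 5    # calls allowed per window before flagging (assumed)

def make_detector():
    recent = defaultdict(deque)  # sim_id -> timestamps of recent intl calls
    def on_call_event(sim_id, timestamp, is_international):
        """Return True if this event trips the fraud rule."""
        if not is_international:
            return False
        calls = recent[sim_id]
        calls.append(timestamp)
        # Evict events that have fallen out of the sliding window.
        while calls and timestamp - calls[0] > WINDOW_SECONDS:
            calls.popleft()
        return len(calls) > MAX_INTL_CALLS
    return on_call_event
```

The point of the example is the shape of the solution: deterministic rules evaluated per event, with only a small amount of state kept per entity, which is what makes this style of processing feasible in near real time.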
Semi-structured and unstructured data can be inducted for real-time event processing after these successes. Some telecom use cases could include analyzing customer interactions captured by a call center application to identify the key problems customers are complaining about. Sentiment analysis on this data can provide the intensity of customer dissatisfaction around these problems. The text analytics can be further improved by transcribing the recorded calls and using transcripts for this analysis. Further, voice analytics can be applied on recorded calls to measure the customer’s mood associated with the complaints. These analyses will not only provide statistics on overall complaints, but will be able to identify dissatisfied high-value customers in real time.
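The text-analysis idea above can be sketched with a crude lexicon-based score. Real deployments would use proper NLP and sentiment models; the word lists and scoring rule below are invented purely for illustration.

```python
# Invented lexicons for the sketch; a real system would use trained models.
NEGATIVE = {"dropped", "slow", "overcharged", "terrible", "cancel"}
INTENSIFIERS = {"very", "extremely", "always"}

def dissatisfaction_score(text):
    """Crude score: each negative word counts 1, doubled when intensified."""
    words = text.lower().split()
    score = 0
    for i, word in enumerate(words):
        w = word.strip(".,!?")
        if w in NEGATIVE:
            boosted = i > 0 and words[i - 1].strip(".,!?") in INTENSIFIERS
            score += 2 if boosted else 1
    return score
```

Even a toy score like this hints at the operational payoff described above: rank complaints by intensity, join the scores with customer value, and surface dissatisfied high-value customers as they call in.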
In this phase, the algorithms are mainly rule-based and fairly deterministic in nature. The use cases can be limited in scope, with actionability and deployment confined to a single department, typically one showing strong traction with the big data initiative; this improves the chances of building accurate models and ensures the insights are deployed and used in operations.
The organization will develop Hadoop data integration skills for different types of data in this phase. It will have gradually built a fairly advanced data governance capability and should have established data management policies and processes for these more exotic data types. Analysts will make more pervasive use of these data sets through self-service analytical sandboxes. The induction of each new data set will be closely linked with business use cases and data management practice: data ownership and accountability, ensuring adequate data quality, capturing business metadata, and addressing security and privacy. Ideally, this should face few impediments and should have the requisite backing from the quarters of the business that benefit from the use case. The data management process should be formalized through these implementations, developing the requisite controls and artifacts.
These parts of the business will now have adopted the use of big data and would have started realizing benefits out of it. The organization will now be at the “Analytical Practitioners” level. The big data cluster will get much larger with induction of these new high-volume data sources, but ideally it will be funded by the departments deploying the use cases.

Phase 3: Operational analytics

In phase 2, the data lake was hydrated with varied structured, semi-structured, and unstructured data, and insights were obtained from it. Typically, these datasets will progressively build a 360-degree view of the customer, aggregating data from all customer touch points.
In phase 3, these insights can be combined using advanced analytic techniques to obtain predictive operational intelligence. For example, customer churn models can be deployed based on the various types of customer interaction data obtained in the previous phase. Campaign management algorithms can be refined based on this additional information. Call center volumes in different categories can be forecast based on historical patterns.
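To make the churn-model idea concrete, here is a minimal scoring sketch: a logistic function combining signals assembled in phase 2 (complaints, dropped-call rate, tenure). The features, weights, and intercept are invented placeholders; in practice they would come from fitting a model on historical churn labels.

```python
import math

# Hypothetical coefficients (feature -> weight) and intercept, for
# illustration only; a real model would learn these from labeled data.
WEIGHTS = {"complaints_90d": 0.8, "dropped_call_rate": 2.5, "tenure_years": -0.4}
INTERCEPT = -1.0

def churn_probability(features):
    """Logistic score in (0, 1) from a dict of customer features."""
    z = INTERCEPT + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))
```

The operational value comes from wiring such a score into processes: retention campaigns triggered above a threshold, for instance, with the feedback loop described below used to refine the weights.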
Until phase 2, the big data program was tactical and bottom-up. Now it needs to be met with a top-down strategy to be effective at this next level. The input data, as well as the actions taken on insights from the use cases, will typically span departments. Hence, the big data program will need strategic direction and sponsorship in this phase, ensuring leadership support for identifying the operational areas where analytics can most effectively improve customer experiences, and for ensuring that the insights obtained drive and enhance the business processes involved.
This leadership is essential for gaining buy-in from managers in sales, service, and support functions applying such insights. Through such leadership, analytics professionals will be able to collaborate with business managers to refine the algorithms and gain feedback about what worked and what did not in applying the analytics in real-world sales, service, and support. Active participation from the business will also be needed in data governance in respect to usage of data and the related privacy issues, which will be more prevalent in this phase. But, success in the previous phases should ideally ensure this participation and sponsorship.
The role of data scientists and domain specialists will become critical in this phase, and the company will have to invest in these skills. The organization is now moving towards being insight driven. Here, the business owners are putting faith in the predictions and forecasts from the predictive models, and the organization has the critical skill base and a robust data management capability. The people, the process, the data, and the technology are in place. They have become “Analytical Innovators”. Organizations will close the gap on competitors that had the early advantage, and can probably break away from them, based on success in this phase.

Phase 4: Strategic analytics

In this phase, adoption of analytics pervades the organization, and the most critical business processes become insight driven. Now the CXOs consult the analytical insights in their decisions, and more strategic decisions also take the big data ‘outside in’ view into account. Enterprise planning becomes more agile by including external drivers derived from big data, making it more responsive to changes in market conditions and customer behavior. For a telecom company, this would mean analytics driving strategic planning on product mixes, new products, cell tower planning, etc.
The gradual transformation of the decision-making culture culminates in the use of data to make smarter business decisions that drive creativity and innovation, bringing the organization to the frontiers of the practice of analytics. At this point, the strong impact of analytics on the bottom line is well established.

The next step in the big data journey

Success with advanced analytics has many daunting prerequisites that put the relatively mature BI shops at a clear advantage. Yet an agile management culture tuned to rapidly changing market conditions is going to be a prerequisite for survival, if not success, in the next decade: adopting analytics is no longer a choice.
We have presented a practical roadmap to big data and analytics adoption based on successful practices in industry. This plan presumes nothing and builds on successes at each phase generating the pre-conditions for the next. It starts from IT use cases with no business impact, progressing to more and more impactful use cases as the requisite capability develops. This generic and high-level roadmap can be customized for an organization, depending on its business challenges and opportunities, its current analytical maturity, and its internal challenges towards big data adoption.

Thanks to Suman Ghosh from TCS for enlightening us on the concepts.

Saturday, October 13, 2012

From Google documents to Google Drive

  • Download Google Drive and enjoy many more features. Google Apps has advanced very well in recent days; it's good to be a Google fan.
  • There is an HTML view of uploaded files, tracking of when multiple files were updated, and privacy and security controls, with a lot more still to discover. The main advantage is that it is integrated with our Gmail, which is reason enough to depend on it heavily.


Wednesday, August 8, 2012

NASA's newly landed Mars science rover Curiosity snapped the first color image of its surroundings while an orbiting sister probe photographed litter left behind during the rover's daring do-or-die descent to the surface, scientists said Tuesday.

Curiosity's color image, taken with a dust cover still on the camera lens, shows the north wall and rim of Gale Crater, a vast basin where the nuclear-powered, six-wheeled rover touched down Sunday night after flying through space for more than eight months.
The picture proved that one of the rover's key instruments, a camera known as the Mars Hand Lens Imager, or MAHLI, was in good working order, affixed to the end of Curiosity's robot arm.
Designed to take magnified, close-up images of rocks and other objects, or wide shots of landscapes, the camera currently remains stowed on the rover's deck. But once in full operation, scientists can use it to capture fine details with a resolution as high as 13.9 microns per pixel -- several times finer than the width of a human hair.
"It works. It's awesome. Can't wait to open it and see what else we can see," Curiosity scientist Ken Edgett told reporters on Tuesday.
The latest images were relayed to Earth during the rover's first full day on the Red Planet, following a descent through the Martian atmosphere and touchdown on Sunday night that NASA hailed as the most elaborate and challenging ever in robotic spaceflight.
The $2.5 billion project is NASA's first astrobiology mission since the Viking probes of the 1970s, and the landing came as a much-welcome success for a space agency beleaguered by science budget cuts and the recent cancellation of its 30-year-old space shuttle program.
The primary mission of Curiosity, touted as the first fully equipped mobile laboratory ever sent to another world, is to search for evidence that the planet most similar to Earth now harbors, or once hosted, the key ingredients necessary for the evolution of microbial life.
But mission controllers at the Jet Propulsion Laboratory in California plan to put the rover and its instruments through several weeks of thorough checks and trial operations before gradually beginning science exploration in earnest.
They want to be sure the car-sized vehicle and its sensitive components came through the tricky, jarring final leg of Curiosity's 352 million-mile (566 million-km) journey to Mars without damage.
Encased in a protective capsule, the rover blasted into the Martian sky at 17 times the speed of sound and slowed itself using friction from steering through the thin atmosphere.
Closer to the ground, the vessel was slowed further by a giant, supersonic parachute before a jet backpack and flying "sky crane" took over to deliver Curiosity the last mile to the surface at 10:32 p.m. PDT on Sunday (1:32 a.m. EDT on Monday/0532 GMT on Monday).
A day later, NASA's sharp-eyed Mars Reconnaissance Orbiter surveyed the scene from a vantage point 186 miles above the planet and found Curiosity's approach to Gale Crater littered with discarded equipment used to position the rover near a towering mountain rising from the crater floor.
"You can see all the components of the entry, descent and landing system," said camera scientist Sarah Milkovich.
The satellite's "crime scene" image, released Tuesday, lays out the trail of debris beginning about 1,312 yards from Curiosity's landing site. That is where the heat shield came to rest after it was jettisoned during descent.
The back shell of the capsule, which contained the parachute, ended up about 673 yards away from the rover. The last part of the elaborate landing system, the rocket-powered "sky crane" crash-landed 711 yards away after lowering Curiosity to the ground on a tether.
Mars Reconnaissance Orbiter's image shows the heat shield in a region dotted with small craters, while Curiosity is surrounded by rounded hills and fewer craters. To the north is a third type of terrain riddled with buttes, mesas and pits.
"If it were up to me I would go to where those three come together, so we could start to get the flavor of what's going on here in terms of the different geologic materials," Edgett said.
Scientists expect it will be weeks until Curiosity begins roving and months before it heads to the 3-mile (5-km) high mountain at the center of the crater, the primary target for the two-year science mission.
Scientists believe the mound, known as Mount Sharp, may have formed from the remains of sediment that once completely filled the basin, offering a potentially valuable geologic record of the history of Mars.


Friday, July 20, 2012


The Concept 6 design team was determined not to let the engine blow out to the party-pooping width of the CBX's imposing donk - and it looks like they've done a good job keeping it acceptably narrow. Each cylinder is still slightly oversquare (its bore is slightly larger than its stroke), which will help it spin up and develop horsepower at higher revs, but the stroke is relatively long compared to the ratios used in BMW's inline fours, keeping those cylinder bores as narrow as possible while retaining the ability to rev.
There's very little space in between cylinders, and the alternator and other electrics have been relocated from the side of the engine back behind the crankshaft in the spot above the transmission. The overall result is a motor that BMW claims is four whole inches narrower than the previous thinnest inline six on the market - and only slightly wider than a big inline four.
With a capacity of 1600cc, and all the extra exhaust headers and gear required by an inline six, it's still going to be a very heavy powerplant, but BMW have used a trick from their K-series sportsbikes to neutralize the negative effects that big lump of metal could have on the bike's handling. With the engine tilted forward by 55 degrees, the main bulk of the cylinder bank is kept low, pushing the centre of gravity down and forward, which should help keep the bike flickable and fun in the twisties.
Peak output will reportedly be similar to the K1300 series engines - somewhere around 170 horsepower - but the big six will belt out a massive 130 Nm of torque from just 2000rpm. For reference, the torque monster Suzuki GSX1400 peaks at about 125 Nm at around 4700rpm. The new engine's torque peak is unspecified, but it should rev as high as 9000rpm, making it a hugely flexible powerplant that BMW believes will be "the ideal power unit for a range of different motorcycles." Yummy!


Thursday, July 5, 2012

Higgs Boson: a new subatomic particle

A great discovery for the decade, one that's going to change the future. The Higgs boson was found by smashing protons together; many scientists strove hard for some 50 years to get here, including our Indians. The "boson" part of the name honors the Indian scientist Satyendra Nath Bose, and the "Higgs" honors Peter Higgs, who predicted the particle.

Friday, March 2, 2012

Overview of WINDOWS 8

If you have been following all the good press around Windows 8 and are waiting to try it on your own computer, here’s the good news. The consumer preview version of Windows 8 (just a fancy name for beta software) is now available for download and it is very likely that your existing system specs are good enough to run Windows 8.

The System Requirements for Windows 8

According to the Windows 8 FAQ, any machine equipped with 1 GB of RAM, 16 GB of hard disk space and 1 GHz processor should be able to handle Windows 8. The minimum RAM requirements are 2 GB in case you would like to install the 64-bit version of Windows 8.

Should you download Windows 8 Setup or the ISO Image?

As you may have noticed on the Windows 8 download page, the installation of Windows 8 can be done in two ways.

You can either take the easiest route and download the Windows 8 Setup program – that’s also the default option.
Alternatively, you can download ISO Images of Windows 8.
If you are planning to install Windows 8 on your existing computer, either on a different partition (dual-boot) or just want to upgrade an older version of Windows to Windows 8, the default Setup program is a good choice.

Please note that your installed software programs will only be preserved if you are upgrading from Windows 7 to Windows 8. If you're planning to install Windows 8 on top of Windows XP or Vista, only your files will be preserved, not the various software programs that you may have on the disk.

The ISO image may be more handy in other situations like:

Your computer has an x64 processor but is running the 32-bit version of Windows. If you want to install the 64-bit version of Windows 8, download the 64-bit ISO.
You have an iMac or MacBook and want to install Windows 8 on the Mac using Boot Camp software.
You want to install Windows 8 on multiple computers. Download the ISO, create a bootable DVD and boot the other system using this Windows 8 disk.
You want to run Windows 8 as a Virtual Machine inside your existing copy of Windows.
You are running Windows XP.
The universal product key for Windows 8 is NF32V-Q9P3W-7DR7Y-JGWRW-JFCK8.

Will my software programs run inside Windows 8?

Before grabbing the ISO image of Windows 8, quickly run this setup utility and it will show a list of all the software programs and hardware drivers on your system that are compatible with Windows 8. Alternatively, you can visit this page to see a list of all known software found to work with Windows 8.

What route should you take?

You can have Windows 8 on your computer in three ways – you can install Windows 8 side-by-side (also known as dual-boot), as a virtual machine (so that it runs inside your existing Windows just like any other software) or Windows 8 can be your main OS (there’s no going back then).

If you just want to try out Windows 8 but without disturbing any of your existing set-up, the safest bet is to use a Virtual Machine. If you have a vacant partition or don’t mind creating one (it’s easy), go for the dual-boot option. Else, if you have a spare computer, you can consider upgrading to Windows 8 overwriting the previous installation of Windows. Good luck!

Saturday, January 7, 2012

Hybrid Electric Vehicle


Sunday, December 18, 2011

Maintain your diary in a better way

Do you keep a diary (or the more manly version…a journal or chronicles)? Or do you wish to keep notes where the notes keep the date automatically? Well here is an awesome trick for you. All you need is notepad.
1) Open a blank notepad file.
2) Type .LOG in all caps at the top and hit enter.
3) Now save the file.
4) After closing the file, reopen it and notice that the date & time is now listed on the second line.
5) Also notice that the cursor is ready for you to start typing on the very next line.
6) From now on, every time you open the file, type your entry on the next line and save; when you reopen it, the date and time of each save will have been recorded automatically.
It keeps a running record of the date and time for each save. Now you have a cheap diary! Congrats!
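The .LOG trick is Notepad-specific, but the same behavior can be replicated anywhere with a few lines of Python: each entry is stamped with the date and time before your text, building the same running diary. The timestamp format below is an arbitrary choice.

```python
from datetime import datetime

def add_diary_entry(path, text):
    """Append a timestamped entry to the diary file, creating it if needed."""
    with open(path, "a") as f:
        # Stamp the moment of writing, just as Notepad's .LOG mode does on open.
        f.write(datetime.now().strftime("%H:%M %m/%d/%Y") + "\n")
        f.write(text + "\n")
```

Run it whenever you want to jot something down, and the file grows into the same cheap, self-dating diary.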
Commit a random act of awesomeness.


Sunday, December 4, 2011


The fastest and most expensive production car ever

When you're ripping along at 253 mph, your mind is not drifting aimlessly. Your senses are cranked up to full volume to detect any hint of impending catastrophe in the maelstrom of wind rush, tire thrum, mechanical thrash, and exhaust roar that surrounds you.
Is that slight shift in the whistling wind caused by a body panel coming loose? Does that vague vibration signal a tire starting to delaminate? Does that subtle new mechanical whine presage a failing bearing that's about to lock up the powertrain?
No such problem developed on the Bugatti Veyron 16.4, because it is not a half-baked aftermarket or boutique road burner. It is a production car developed and tested to the standards of Volkswagen, Bugatti's parent company. With a top speed of 253 mph, it is also the fastest production car ever built.
Production, of course, is a relative term. In the case of the Veyron, Bugatti plans to build only about 50 cars a year at a price of €1 million, which is about $1,250,000 as this is written. To this rarefied market Bugatti has brought an unusual level of sophistication and engineering necessitated by the promise of 1001 metric horsepower (or 987 American horses) and a top speed of 252 mph, a pledge from former VW boss Ferdinand Piëch when he unveiled the production-intent Veyron at the 2001 Geneva auto show.
Achieving 1000 horsepower in a racing engine is one thing, but to do so in a reliable, refined, durable, and emissions-legal configuration is much harder. The energizer in the Veyron is a W16 displacing 7998cc and turbocharged with 15.8 psi of boost. You can think of it as two Passat W8 engines put together and pumped up by four turbos.
But the Bugatti engine has more cylinders, more displacement, more power per liter, and more output overall than any other engine in the W family tree. When I ask Bugatti development boss Wolfgang Schreiber to explain how the same engine can be rated at 1001 SAE net horsepower at 6000 rpm for the U.S. but only 987 horsepower (1001 PS) for Europe, he laughs, saying, "The production engines are all putting out between 1020 and 1040 PS, enough to cover both promises."
The engine's torque peak is equally mighty at 922 pound-feet, developed between 2200 and 5500 rpm. The four small turbos minimize throttle lag, and the 9.3:1 compression ratio ensures reasonable torque even before boost develops.
All that twist required a dedicated transmission. The Veyron gets a King Kong seven-speed version of VW's twin-clutch gearbox, called DSG. Like the DSG available in the Audi TT, it operates with an automatic mode or a full manual mode via paddle shifters. Because gearchanges occur with one clutch disengaging as the other engages, shifts are uniformly smooth and swift.
With about as much engine output as two Corvette Z06 V-8s, it's no surprise that Bugatti engineers decided to go with all-wheel drive. We don't have many details about the driveline, but the front-to-rear torque split is automatically adjusted to suit dynamic conditions and can range from 100 to 0 percent at either end.
An engine, particularly a turbocharged one, that develops four-digit power throws off more heat than a dozen pizza ovens. Consequently, in the nose of the Veyron are three coolant radiators, one heat exchanger for the twin air-to-liquid intercoolers, and two air-conditioning condensers. There are also transmission and differential oil coolers on the right side and a large engine-oil cooler in the left-side air intake. To help heat escape from the engine compartment, the big WR16 sits in the open, enclosed by no cover of any kind. This powertrain propels the 4300-pound Veyron as effortlessly and gracefully as Tiger Woods belts a 300-yard drive.
My experience with the car took place at Ehra-Lessien in Germany, Volkswagen's test track and high-speed theme park not far from VW headquarters in Wolfsburg. At least it will soon become a theme park because Bugatti plans to let Veyron owners bring their cars to this 13.0-mile circuit to explore the top speed of their cars. In addition to finding out how fast the Veyron can go, I was a guinea pig for this ultimate high-speed thrill ride.
We started with two familiarization laps to get a feel for the track and the car. The track is simple, with a pair of high-banked, 150-mph corners connected by two five-mile-long straights, one of which has a slight bend so that it touches a common parking area.
With the Veyron's high beltline, I couldn't see any of the front bodywork from the driver's seat, but the view of the pavement immediately in front of the car is excellent. The driving position is comfortable, with a snug sport seat that provides great lateral support and manual fore-and-aft and seatback-angle adjustments (a plusher power seat will be optional).
Even after it was lowered to my preferred position, the steering wheel did not obstruct my view of the instrument cluster. And despite the Veyron's low, 47.5-inch height, there was plenty of clearance between my helmeted head and the headliner. Schreiber promises the car will accommodate drivers as tall as six foot seven.
Although the Veyron idles with a quiet murmur, as soon as it starts rolling you hear a symphony of mechanical music that gives way to tire thrum when you get above 100 mph, which doesn't take long. We had no opportunity to perform acceleration testing, but the ease with which the Bugatti blows past that speed is astonishing. We predict about six seconds flat from a dead stop.
What's more, the acceleration doesn't slacken when you hit triple-digit speeds. In my first lap, I took the car up to about 185 mph, at which point the tire noise was fairly loud but the Veyron was otherwise calm and relaxed. One reason it felt so secure is that when you hit 137 mph, the Bugatti hunkers down, lowering its normal ride height of 4.9 inches to 3.1 in front and 3.7 in the rear. At the same time a small spoiler deploys from the rear bodywork and a wing extends about a foot, perched at a six-degree angle. Two underbody flaps ahead of the front tires also open up. This configuration produces substantial downforce, about 330 pounds in front and 440 in the rear at 230 mph.
Given that it only takes about 500 horsepower to overcome the prevailing drag at 185 mph, that leaves the 500 horses remaining for acceleration duty. So when you plant your right foot at 185, the Veyron's surge of power shoves you into the driver's seat about as hard as a Corvette's does at 100 mph, or a Ford Five Hundred's does at 40 mph. Accelerating from 185 to 230 on my next lap didn't take very long, and the car remained glued to the pavement, although wind roar overcame tire thrumming to become the predominant sound.
But 230 mph is about as fast as the Veyron will go until you put the car into top-speed mode. This involves coming to a stop and, while the car is idling, turning a key in a lock on the floor to the left of the driver's seat. When you do that, the car sinks down even lower on its suspension, until ground clearance has been reduced to a mere 2.6 inches in front and 2.8 in the rear. This setup also causes the front underbody flaps to close and the rear spoiler and wing to retract, although the wing remains tilted out of the body at a slight two-degree angle. These changes reduce the car's drag coefficient from 0.41 to 0.36, and they reduce the peak downforce from 770 to 120 pounds.


Thursday, December 1, 2011



Google has designed Chrome OS with the Web in mind, and most of its functionality will be available only if the Chromebook is connected to the internet. Users' apps, games, photos, music, movies, and documents will all live in the cloud.

The bare-bones operating system is essentially a Web browser that will guide users to applications like email and spreadsheets directly on the Web, instead of storing software such as, say, Outlook or Word on PCs.

Moving day-to-day functions onto the Internet removes the burden of time-consuming tasks associated with traditional PCs, like installing software and updates, backing up files and running antivirus checks.

As mentioned before, everything is in the cloud. The laptops will therefore be tightly integrated with Google's "cloud" online services and will have almost no capacity to store information locally, though they will have slots for storage devices users buy separately.

As with the company's mobile OS Android, Chrome software will be free.

Samsung Electronics Co and Acer Inc made the first Chromebooks. Both models have keyboards but no hard drives for storage; the machines are essentially terminals dependent on a connection to the Internet. The laptops come with 16 gigabytes of flash memory -- the kind found in smartphones, tablet computers, and some iPods -- and run on Intel Corp's Atom chip.

While Google has diligently worked to make sure Chromebooks can be used offline, the computing model ultimately relies on being connected to the Internet.

The company claims that Chromebooks will be up and running in about eight seconds. Every time a user turns one on, the software checks online for updates, so it always boots up with the latest version.

If there's a failure, for whatever reason, the OS will simply reinstall itself.

Google Chromebook comes with security features such as secure tabbed browsing (called sandboxing), data encryption, and verified boot. According to Google, "Chromebooks have many layers of security built in so there is no anti-virus software to buy and maintain."


GE to Open New Global Software Headquarters in Bay Area, Hire 400 Software Engineers


As the Internet evolved from the dial-up days of America Online to the always-on, cloud-dwelling social network, a parallel development was taking place in the background: the digital web of the world's trillions of machines.
Over the last several decades, GE’s software engineers have guided the growth of this emerging industrial Internet. Putting their brains and manufacturing skills to the task, they connected jet engines, power transformers, and medical devices to boost the efficiency of these complex systems and save customers money. With some 5,000 software engineers on staff, GE’s software revenues are about $2.5 billion and the company expects double-digit growth from now until 2015.
Today, GE announced what would be a new dynamo powering this growth: a new Global Software Center, located in San Ramon, California. The center will hire and house 400 software engineers and other professionals developing digital tools that gather and analyze the millions of gigabytes of data generated by controls, sensors, computers and other parts of the brains of industrial machines. These tools will predict and respond to changes, and guide customers in how to best use their assets.
It’s the kind of work that went into GE’s rail Movement Planner and Trip Optimizer. The program gets locomotives to talk to each other and loops in traffic control systems, freight loaders, and technicians with their smartphones. This is no idle talk: a railroad can increase speeds up to 20%, cut fuel consumption by 10%, and save as much as $200 million in capital and expenses annually.
The San Ramon facility will be GE’s “nerve center for software” and link to other GE businesses and software engineers. Mark Little, GE’s Chief Technology Officer, says that the center will promote collaboration across GE and its diverse group of customers. “On any given day, one of our software experts could be working on a clean energy project, while at the same time contributing to a program that improves the delivery of health care,” says Little.


Wednesday, November 30, 2011

Nuclear weapons

Nuclear weapons
A few words about nuclear weapons technology.

Fission weapons

Nuclear weapons exploit two principal physical (more specifically, nuclear) properties of certain substances: fission and fusion.
Fission is possible in a number of heavy elements, but in weapons it is principally confined to what is termed slow neutron fission in just two particular isotopes: 235U and 239Pu. These are termed fissile, and are the source of energy in atomic weapons. An explosive chain reaction can be started with relatively slight energy input (so-called slow neutrons) in such material.
An actual 239Pu ingot, alloyed with gallium for improved physical properties
Isotopes are 'varieties' of an element which differ only in their number of neutrons. For example, hydrogen exists as 1H, 2H, and 3H -- different isotopes of the same chemical element, with zero, one, and two neutrons respectively. All the chemical properties, and most of the physical properties, are the same between isotopes. Nuclear properties may differ significantly, however.
The fission, or 'splitting' of an atom, releases a very large amount of energy per unit volume -- but a single atom is very small indeed. The key to an uncontrolled or explosive release of this energy in a mass of fissile material large enough to constitute a weapon is the establishment of a chain reaction with a short time period and high growth rate. This is surprisingly easy to do.
Fission of 235U (uranium) or 239Pu (plutonium) starts in most weapons with an incident source of neutrons. These strike atoms of the fissile material, which (in most cases) fission; each atom in so doing releases, on average, somewhat more than 2 neutrons. These then strike other atoms in the mass of material, and so on.
If the mass is too small, or has too large a surface area, too many neutrons escape and a chain reaction is not possible; such a mass is termed subcritical. If the neutrons generated exactly equal the number consumed in subsequent fissions, the mass is said to be critical. If the mass is in excess of this, it is termed supercritical.
Fission (atomic) weapons are simply based on assembling a supercritical mass of fissile material quickly enough to counter disassembly forces.
The majority of the energy release is nearly instantaneous: the mean time from neutron release to fission can be on the order of 10 nanoseconds, and the chain reaction builds exponentially. The result is that greater than 99% of the very considerable energy released in an atomic explosion is generated in the last few (typically 4-5) generations of fission -- less than a tenth of a microsecond.*
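The exponential arithmetic behind that claim can be sketched with a toy model. The multiplication factor per generation (roughly 2.5, i.e., one fission per surviving neutron times the 2+ neutrons released per fission) and the generation count below are illustrative assumptions for the sketch, not design data.

```python
# Toy model of exponential chain-reaction growth.
# Assumptions (illustrative only): multiplication factor k ~ 2.5 per
# generation, ~10 ns per generation.

def energy_fraction_last(total_gens, last_n, k=2.5):
    """Fraction of all fissions that occur in the final last_n generations."""
    total = sum(k ** g for g in range(total_gens))
    tail = sum(k ** g for g in range(total_gens - last_n, total_gens))
    return tail / total

# With k = 2.5, the last 5 generations contribute ~99% of the fissions,
# and a few dozen generations at ~10 ns each still total well under a
# microsecond.
print(energy_fraction_last(58, 5))
```

The key point the model illustrates: in any exponential growth with factor k, the final n generations account for roughly a 1 - k^(-n) share of the total, which is why almost all the energy appears in the last instants.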
This tremendous energy release in a small space over fantastically short periods of time creates some unusual phenomena -- physical conditions that have no equal on earth, no matter how much TNT is stacked up.
Plutonium (239Pu) is the principal fissile material used in today's nuclear weapons. The actual amount of this fissile material required for a nuclear weapon is shockingly small. 
[Image: a scale model of the amount of 239Pu required in a weapon with the force that destroyed the city of Nagasaki in 1945]
In the Fat Man (Nagasaki) weapon design an excess of Pu was provided. Most of the remaining bulk of the weapon consisted of two concentric shells of high explosives, each carefully fashioned from two types of explosives with differing burn rates. These, when detonated symmetrically at the outermost layer, caused an implosion, or inward-moving explosion.
The two explosive types were shaped to create a roughly spherical convergent shockwave which, when it reached the Pu 'pit' in the center of the device, caused it to collapse. 
The Pu pit became denser, underwent a phase change, and became supercritical. 
A small neutron source, the initiator, placed in the very center of this Pu pit, provided an initial burst of neutrons -- the final generations of which, less than a microsecond later, saw the destruction of an entire city and more than 30,000 people.
Nearly all the design information for weapons such as these is now in the public domain; in fact, given that fission weapons exploit such a simple and fundamental nuclear property, it is no surprise that this is so. It is more surprising that so much stayed secret for so long, at least from the general public.
A neutron reflector, often made of beryllium, is placed outside the central pit to reflect neutrons back into the pit. A tamper, often made of depleted uranium or 238U, helps delay premature disassembly. Modern fission devices use a technique called 'boosting' (described in the next section) to control and enhance the yield of the device.
Today's nuclear threat lies mostly in preventing this fissile special nuclear material (often referred to as SNM) from falling into the wrong hands: once there, it is a very short step to construct a working weapon.
What we do now to keep these devices out of the hands of groups like Al-Qaeda is vital to civilized peoples.
A schematic of a hypothetical 'boosted' fission weapon
(showing unnecessary 235U)

The gadget device used in the Trinity test: the world's first nuclear weapon test. 
Note spherical geometry and the HE detonator arrangement. New Mexico, 21KT, 1945.
Typical fission weapon, shortly after detonation at the Nevada test site, with roughly the same yield as the weapon that destroyed Hiroshima. Reddish vapor surrounding the plasma toroid includes intensely radioactive fission fragments and ionized nitrogen oxides from the atmosphere. (Grable, 15KT, 1953)

Fusion weapons

Fission weapons discussed above are ultimately limited in their destructive capability by the sheer size a subcritical mass can assume -- and be imploded quickly enough by high explosives to form a supercritical assembly. The largest known pure fission weapon tested had a 500 kiloton yield. This is some thirty-eight times the release which destroyed Hiroshima in 1945. Not satisfied that this was powerful enough, designers developed thermonuclear (fusion) weapons.
Fusion exploits the energy released in the fusing of two atoms to form a new element; e.g. deuterium atoms fusing to form helium, 2H + 2H → 4He, as occurs in the sun. For atoms to fuse, very high temperatures and pressures are required. Only fusion of the lightest element, hydrogen, has proven practical, and only the heavy isotopes of hydrogen, 2H (deuterium) and 3H (tritium), have a low enough threshold for fusion to have been used in weapons successfully thus far.
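The scale of the energy released per fusion can be checked with a simple mass-defect calculation. The atomic masses below are standard table values I am supplying for the sketch; they are not from the original post.

```python
# Mass-defect estimate of the D + D -> He-4 energy release cited above.
# Atomic masses in unified atomic mass units (u), standard table values.
M_D = 2.014102      # deuterium (2H)
M_HE4 = 4.002602    # helium-4
U_TO_MEV = 931.494  # energy equivalent of 1 u, in MeV

# E = (delta m) * c^2: the mass lost in fusing two deuterons into He-4
q_mev = (2 * M_D - M_HE4) * U_TO_MEV
print(round(q_mev, 1))  # ~23.8 MeV per fusion event
```

Compare that with roughly 200 MeV per fission of 235U or 239Pu: fission releases more energy per event, but fusion releases far more per unit mass of fuel, which is what matters in a weapon.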
The first method tried (boosting) involved simply placing 3H in a void within the center of a fission weapon, where tremendous temperatures and pressures were attendant to the fission explosion. This worked, contributing energy to the overall explosion and boosting the efficiency of the Pu fissioning as well (fusion reactions also release neutrons, but with much higher energy).
Because 3H is a gas at room temperature, it can easily be 'bled' into the central cavity from a storage bottle prior to detonation, varying the final yield of the device. This is still used today, and allows the 'dial-a-yield' capability of many stockpiled weapons.
Multistage thermonuclear weapons -- the main component of today's strategic nuclear forces -- are more complex. These employ a 'primary' fission weapon to serve merely as a trigger. As mentioned above, the fission weapon is characterized by a tremendous energy release in a small space over a short period of time. As a result, a very large fraction of the initial energy release is in the form of thermal X-rays. 
These X-rays are channeled to a 'secondary' fusion package. The X-rays travel into a cavity within a cylindrical radiation container.
The radiation pressure from these X-rays, either directly or through an intermediate material often cited as polystyrene foam, ablates a cylindrical enclosure containing thermonuclear fuel; this can be Li2H (lithium deuteride).
Running along the central axis of this fuel is a rod of fissile material, termed a 'sparkplug'. 
The contracting fuel package becomes denser, the sparkplug begins to fission, neutrons from this transmute the 6Li into 3H that can readily fuse with 2H (the 3H + 2H fusion reaction has a very high cross-section, or probability, in typical secondary designs), heat increases greatly, and fusion continues through the fuel mass.
A final 'tertiary' stage can be added to this in the form of an exterior blanket of 238U, wrapping the outer surface of the radiation case or the fuel package. 238U is not fissionable by the slower neutrons which dominate the fission weapon environment, but fusion releases copious high energy neutrons and this can fast fission the ordinary uranium. 
This is a cheap (and radiologically very dirty) way to greatly increase yield. The largest weapon ever detonated -- the Soviet Union's 'super bomb' -- was some 60 MT in yield, and would have been nearer 100 MT had this technique been used in its tertiary. Again, to control the yield precisely, 3H may be bled from a separate tank into the core of the primary, as shown in the hypothetical diagram of a modern thermonuclear weapon.
This primary/secondary/tertiary or multistage arrangement can be increased -- unlike the fission weapon -- to provide insane governments with any arbitrarily large yield.
Rare photo of the actual Shrimp device used in Castle Bravo. Note the cylindrical geometry and the emergent spherical fission trigger on the right. Light pipes leading to the ceiling are visible near the fission trigger and at two points along the secondary; they transmit early diagnostic information to remote collection points before they themselves are destroyed.

Note the 'danger, no smoking' sign at lower left. 15MT, 1954.

Fusion, or thermonuclear, weapons are not simple to design, nor are they likely to be built by would-be terrorists today.
Many aspects of the relevant radiation transport, X-ray opacities, and ultra-high T and D equations-of-state (EOS) for relevant materials are still classified to this day (though increasing dissemination of weapons-adaptable information from the inertially-confined fusion (ICF) area may change this in time). Keeping such information classified makes good sense.
Typical appearance of a thermonuclear weapon detonation -- from many miles away. 
(Castle Romeo, 7MT, 1954)

*Special techniques were required to record the fleeting moments of a weapon's initial detonation. One such method was the Rapatronic camera, developed by Dr. Harold Edgerton. The images it created are bizarre. Check out our collection of Rapatronic photographs.


Monday, November 28, 2011

Best Hybrid Electric Vehicle - CHEVROLET VOLT

First And The Best Hybrid Electric Vehicle- CHEVROLET VOLT

The Chevrolet Volt is a plug-in hybrid electric vehicle manufactured by General Motors. The Volt has been on sale in the U.S. market since mid-December 2010, and is the most fuel-efficient compact car sold in the United States, as rated by the United States Environmental Protection Agency (EPA).
According to General Motors, the Volt can travel 25 to 50 miles (40 to 80 km) on its lithium-ion battery alone. The EPA official all-electric range is 35 miles (56 km), and the total range is 379 miles (610 km). The EPA rated the 2011 model year Volt's combined city/highway fuel economy at 93 mpg-US (2.5 L/100 km; 112 mpg-imp) equivalent (MPG-e) in all-electric mode, and at 37 mpg-US (6.4 L/100 km; 44 mpg-imp) in gasoline-only mode, for an overall combined gasoline-electric fuel economy rating of 60 mpg-US (3.9 L/100 km; 72 mpg-imp) equivalent. The 2012 model year Volt received a revised EPA rating, increasing the combined city/highway fuel economy in all-electric mode to 94 MPG-e. The Volt operates as a pure battery electric vehicle until its plug-in battery capacity is depleted, at which point its gasoline engine powers an electric generator to extend the vehicle's range. The Volt's regenerative braking also contributes to the on-board electricity generation. To improve performance, the internal combustion engine may at times be engaged mechanically to assist both electric motors in propelling the Volt.
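The metric equivalents quoted alongside the EPA mpg figures can be double-checked with a quick unit conversion (standard conversion factors; a sketch, not EPA methodology):

```python
# Convert miles per US gallon to litres per 100 km and check the
# figures quoted above.
L_PER_GAL_US = 3.785411784  # litres per US gallon
KM_PER_MILE = 1.609344      # kilometres per mile

def mpg_us_to_l_per_100km(mpg):
    """Fuel consumption in L/100 km for a given mpg (US) rating."""
    return 100 * L_PER_GAL_US / (mpg * KM_PER_MILE)

print(round(mpg_us_to_l_per_100km(93), 1))  # all-electric mode: 2.5
print(round(mpg_us_to_l_per_100km(37), 1))  # gasoline-only mode: 6.4
```

Both results match the parenthetical L/100 km values in the EPA ratings, which is a useful sanity check given how easily mpg and L/100 km figures get transposed.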
The suggested retail price for the 2011 Chevrolet Volt in the U.S. starts at US$40,280 before the US$7,500 U.S. federal tax credit; additional incentives are available in some locations. The 2012 Volt base price will be US$1,005 less than the 2011 model, as the base configuration has been defeatured. The 2011 Volt is being sold only in selected U.S. markets, and nationwide availability of the 2012 model year is expected by November 2011. Deliveries of the 2012 Volt began in Canada in September 2011, and the suggested retail price starts at CAD 41,545 (US$43,568) before any available rebates. The initial Canadian launch is also limited to selected markets, and availability in the rest of Canada is expected before the end of 2012. In China the Volt's price starts at RMB 498,000 (around US$75,500) and initial sales will be limited to eight cities. In the Eurozone the Volt platform will be sold as the Opel/Vauxhall Ampera, and is expected to sell for about €42,900 (US$58,000) including VAT before any government incentives. In the United Kingdom the Vauxhall Ampera is expected to be priced at GB£33,995. Exports to Europe and China are scheduled for late 2011.
Among other awards and recognition, the Chevrolet Volt won the 2009 Green Car Vision Award, 2011 Motor Trend Car of the Year, 2011 Green Car of the Year, 2011 North American Car of the Year, and 2011 World Green Car. Despite the awards earned and the positive reception from many automotive critics, there has been some controversy and concerns in the media. These include the extent of the federal government participation in the Volt development during General Motors' 2009 government-led bankruptcy; concerns about the Volt's relatively high sales price; and complaints about price markups due to the Volt's initial limited supply.