New submitter multimediavt writes "Ok, here's my problem. I have a lot of personal data! (And, no, it's not pr0n, warez, or anything the MPAA or RIAA would be concerned about.) I am realizing that I need to keep at least one spare drive the same size as my largest drive around in case of failure, or in case I need to reformat a drive due to file system corruption. In my particular case I have a few external drives ranging in size from 200 GB to 2 TB (none with any more than 15 available), and the 2 TB drive is giving me fits at the moment, so I need to move the data off and reformat the drive to see if it's just a file system issue or a component issue. I don't have 1.6 TB of free space anywhere, and came to the above realization that an empty spare drive the size of my largest drive was needed. If I had a RAID, I would have the same need should a drive fail for some reason and the file system need rebuilding. I am hitting a wall, and I am guessing that I am not the only one reaching this conclusion. This is my personal data, and it is starting to become unbelievably unruly to deal with as far as data integrity and security are concerned. This problem is only going to get worse, and I'm sorry, 'The Cloud' is neither an acceptable nor a practical solution. Tape as a backup mechanism is not economically feasible for an individual. Blu-ray Disc holds only 50 GB in the best case, takes forever when backing up any large amount of data, and requires a great deal of human intervention in the process. So, as an individual with a large data collection and not a large budget, what do you see as options for now (other than keeping a spare blank drive around), and what do you see down the road that might help us deal with issues like this?"
Bananatree3 writes "While we have sci-fi visions of room temperature superconductors like in the movie Avatar, the question still remains: How would the discovery of such a material impact our everyday lives? How would the nature of warfare change? How would the global economy react? What are the cultural pros and cons of such a technological shift?" And just as important, in what contexts would you want to see it first employed?
ananyo writes "Laser beams at the National Ignition Facility have fired a record 1.875 megajoule shot into its target chamber, surpassing their design specification. The achievement is a milepost on the way to ignition — the 'break-even' point at which the facility will finally be able to release more energy than goes into the laser shot by imploding a target pellet of hydrogen isotopes. NIF's managers think the end of their two-year campaign for break-even energy is in sight and say they should achieve ignition before the end of 2012. However, with scientists at NIF saying that a $4 billion pilot plant could be putting hundreds of megawatts into the grid by the early 2020s, some question whether the Department of Energy is backing the wrong horse with ITER — a $21 billion international fusion experiment under construction at St-Paul-lez-Durance, France. Is it time for the DoE to switch priorities and back NIF's proposals?" Perhaps a better idea, given the potential benefits of fusion research, would be for the DoE to throw their weight behind multiple projects, rather than sacrificing some to support others.
New submitter unimacs writes "So Apple has been under fire recently for the conditions at the factories of their Chinese suppliers. I listened to 'This American Life's' recent retraction of the Michael Daisey piece they did a while back. Great radio for those of you who haven't heard it — rarely has dead air been used to such effect. Anyway, while his work has been discredited, Michael Daisey wasn't inaccurate in his claims that working conditions are poor in iPhone and iPad factories. Given that, are there any smartphone manufacturers whose phones are made under better conditions?"
ichard writes "In a couple of months I'm going to start working from home full-time. I've been thinking about the obvious things like workspace ergonomics, but I'm sure there are more subtle considerations involved in a zero-minute commute. What are other Slashdot readers' experiences and recommendations for working from home? How do you stay focused and motivated?"
An anonymous reader writes "Contrary to what many individuals think, not everybody on Slashdot went to college for a computer-related degree. I graduate in May of this year with an undergraduate degree in psychology. Like many undergraduate psychology students, I applied to a multitude of graduate programs but, unfortunately, was not admitted to a single one. Many are aware that a bachelor's degree in psychology is quite limiting, so I have undoubtedly been forced into a complicated situation. Despite my degree being in psychology, I have an immense interest in computers and the typical 'hard science' fields. How can one with a degree that is not related to computers acquire a job that is centered around computers? At the moment, I am self-taught and can easily keep up in a conversation of computer science majors. I also do a decent amount of programming in C, Perl, and Python and have contributed to small open source projects. Would Slashdot users recommend receiving a formal computer science education (only about two years, since the nonsensical general education requirements are already completed) before attempting to get such a job? Anybody else in a similar situation?"
jm223 writes "I'm currently a student at a major university, where I do IT work for a fairly large student group. Most of my job involves programming, and so far everyone has been happy with my work. Since we're students, though, no one really has the experience to offer major advice or critiques, and I'm curious about how my coding measures up — and, of course, how I can make it better. CS professors can offer feedback about class projects, but my schoolwork often bears little resemblance to my other work. So, when you're programming without an experienced manager above you, how do you go about improving?"
Dmitri Baughman writes "I'm the IT guy at a small software development company of about 100 employees. Everyone is technically inclined, with disciplines in development, QA, and PM areas. As part of a monthly knowledge-sharing meeting, I've been asked to give a 30-minute presentation about our computing and networking infrastructure. I manage a pretty typical environment, so I'm not sure how to present the information in a fun and engaging way. I think network diagrams and bandwidth usage charts would make anyone's eyes glaze over! Any ideas for holding everyone's interest?"
New submitter Manzanita writes "The domain of personal analytics, or 'Quantified Self,' is rich with interesting things to measure and many hackers have started projects. But they will only take off if it is sufficiently easy to gather and use the data. Stephen Wolfram has collected and analyzed a lot of his personal data over the last 20 years, but that is far beyond what most of us have the time for. What do you find worth tracking? What is ripe for developing into a business?"
New submitter derchris writes "We will be on vacation in the U.S. next month for about 3 weeks. We are going to do a road trip between San Francisco, Las Vegas, and Los Angeles. To avoid using roaming for data, and to avoid a heart attack once back home looking at the mobile bill, I have been looking at so-called 'MiFi' devices, portable 3G Wi-Fi hotspots. As far as I know, more or less all of the U.S. carriers have such devices available. But as I'm not from the U.S., I have no idea which would give me the best 3G coverage in the areas we are travelling. Another question is whether I can buy one of these devices off eBay and use it with any SIM card. Let's hope there are users here who can give some advice on this topic."
New submitter es330td writes "I'd like to write a program that takes the old cannon game to another level, but instead of the path being a simple parabolic arc, the projectile will move through a field of objects exerting gravitational attraction (or repulsion), and the player will have to adjust velocity and angle to find the path through the space between launch point and target. In an ideal world, this would end up as one of those Flash-based, web-playable games, as that would force me to fully flesh it out, debug, and complete the app. I doubt this will ever be commercial, so hiring somebody doesn't make sense, and I wouldn't learn anything that way either. I have been programming for almost 20 years, but the bulk of my work has been in corporate programming, primarily web (ColdFusion, ASP & C#.NET) or VB6 and then C# Windows GUI interfaces to RDBMS. I have never written a graphics-based game, nor have I ever written something using the physics this will require. Once upon a time, I could program in C, but I think I would be much better off working with someone rather than trying to roll my own, unless good books exist to flatten the learning curve. Any advice on how to proceed?"
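The physics the submitter describes is simpler than it may sound: sum the inverse-square accelerations from each body and integrate. A minimal sketch, assuming basic Euler integration in game units; all names, constants, and the example attractor are illustrative, not from the submitter's project:

```python
import math

G = 1.0  # illustrative gravitational constant in game units

def step(pos, vel, bodies, dt=0.01):
    """Advance the projectile one time step.

    pos, vel: (x, y) tuples; bodies: list of (x, y, mass) attractors.
    A negative mass gives repulsion instead of attraction.
    """
    ax = ay = 0.0
    for bx, by, m in bodies:
        dx, dy = bx - pos[0], by - pos[1]
        r2 = dx * dx + dy * dy
        r = math.sqrt(r2)
        if r < 1e-9:
            continue  # skip the singularity at a body's exact center
        a = G * m / r2  # inverse-square acceleration toward the body
        ax += a * dx / r
        ay += a * dy / r
    vel = (vel[0] + ax * dt, vel[1] + ay * dt)
    pos = (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)
    return pos, vel

def simulate(pos, vel, bodies, steps=1000):
    """Trace the projectile's path as a list of positions."""
    path = [pos]
    for _ in range(steps):
        pos, vel = step(pos, vel, bodies)
        path.append(pos)
    return path

# Fire horizontally past a single attractor above the x-axis; the
# path bends upward toward it instead of following a parabola.
path = simulate((0.0, 0.0), (1.0, 0.0), [(5.0, 2.0, 10.0)])
print(path[-1])
```

For a playable game, the same `step` function can run once per frame, with the player only choosing the initial `vel`; a higher-order integrator (e.g. velocity Verlet) would reduce drift if trajectories need to be stable over long flights.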
jjp9999 writes "I've been looking for some good reading material, and have been delving into the realms of some great, but nearly forgotten authors — finding the likes of Lord Dunsany (The King of Elfland's Daughter) and E.R. Eddison (The Worm Ouroboros). I wanted to ask the community here: do you know of any other great fantasy or science fiction books that time has forgotten?"
holmedog writes "A simple question with a lot of answers (I hope). I recently had issues with my DSL broadband at home, and after a month of no resolution, I was told 300ms latency (to their test servers) was the acceptable range for CenturyLink 10.0 Mbps service. This got a shocked reaction out of me, to say the least. I would consider anything over 125ms to be in the unacceptable range. So, I have come to you to ask: What do you consider to be acceptable broadband latency, and why?"
nirgle writes "I have been wondering lately if there are any kids interested in programming for its own sake anymore. When I was my nephew's age, computers were still fascinating: There wasn't a laptop on every table, Facebook wasn't splattered on every screen, and you couldn't get any question answered in just a couple of seconds with Google. When I was 10, I would have done anything for a close programming mentor instead of the 5-foot-high stack of books that I had to read cover-to-cover on my own. So I was happy when my nephew started asking about learning to do what "Uncle Jay does." Does the responsibility now shift to us to kindle early fires in computer science, or is programming now just another profession for the educational system to manage?" Another reader pointed out a related post on the Invent with Python blog titled "Nobody wants to learn how to program."
Mooga writes "I am a hard-core user of Firefox 3.6.x who has chosen to stick with the older, yet still supported, version of Firefox for many years now. However, 3.6.x will soon hit end-of-life, making my life, and the lives of similar users, much more complicated. 3.6.x is generally known to be more stable and use less RAM than the modern Firefox 10, and even Chrome. The older version of Firefox is already having issues rendering modern websites. What are others who have been holding onto 3.6.x planning to do?"