Graphics

Ask Slashdot: Should CPU, GPU Name-Numbering Indicate Real World Performance? 184

dryriver writes: Anyone who has built a PC in recent years knows how confusing the letters and numbers that trail modern CPU and GPU names can be because they do not necessarily tell you how fast one electronic part is compared to another electronic part. A Zoomdaahl Core C-5 7780 is not necessarily faster than a Boomberg ElectronRipper V-6 6220 -- the number at the end, unlike a GFLOPS or TFLOPS number for example, tells you very little about the real-world performance of the part. It is not easy to create one unified, standardized performance benchmark that could change this. One part may be great for 3D gaming, a competing part may smoke the first part in a database server application, and a third part may compress 4K HEVC video 11% faster. So creating something like, say, a Standardized Real-World Application Performance Score (SRWAPS) and putting that score next to the part name, letters, or series number will probably never happen. A lot of competing companies would have to agree to a particular type of benchmark, make sure all benchmarking is done fairly and accurately, and so on and so forth.

But how is the average consumer, just trying to buy the right home laptop or gaming PC for their kids, supposed to cope with the "letters and numbers salad" that follows CPU, GPU and other computer part names? If you are computer literate, you can dive right into the performance benchmarks for a given part on a typical tech site that benchmarks parts. But what if you are "Computer Buyer Joe" or "Jane Average" and you just want to see quickly which of two products -- two budget-priced laptops listed on Amazon.com, for example -- has the better performance overall? Is there no way to create some kind of rough numeric indicator of real-world performance and put it in a product's specs for quick comparison?
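As a purely hypothetical sketch of what such a rough indicator could look like -- the workload names, weights and figures below are invented, and the hard part the submitter identifies (getting vendors to agree on them) remains untouched -- a composite score could simply be a weighted geometric mean of per-workload results normalized against a reference part:

```python
# Hypothetical sketch only: one way a composite score like the submitter's
# "SRWAPS" could be computed -- a weighted geometric mean of per-workload
# results, each normalized against a reference part. All names and numbers
# are invented for illustration.
from math import prod

def composite_score(results, reference, weights=None):
    """Weighted geometric mean of normalized results; the reference part scores 100."""
    workloads = list(reference)
    weights = weights or {w: 1.0 for w in workloads}
    total = sum(weights[w] for w in workloads)
    ratios = [(results[w] / reference[w]) ** (weights[w] / total) for w in workloads]
    return 100.0 * prod(ratios)

reference = {"gaming_fps": 60.0, "db_tps": 1200.0, "hevc_encode_fps": 30.0}
part_a    = {"gaming_fps": 75.0, "db_tps":  900.0, "hevc_encode_fps": 33.0}

print(round(composite_score(part_a, reference), 1))  # one roughly comparable number
```

A geometric mean keeps any single workload from dominating the result, which is why suites like SPEC report their scores that way; the political problem of agreeing on the workloads and weights is another matter entirely.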
Programming

Ask Slashdot: Are 'Full Stack' Developers a Thing? 371

"It seems that nearly every job posting for a software developer these days requires someone who can do it all," complains Slashdot reader datavirtue, noting a main focus on finding someone to do "front end work and back end work and database work and message queue work...." I have been in a relatively small shop that for years that has always had a few guys focused on the UI. The rest of us might have to do something on the front-end but are mostly engaged in more complex "back-end" development or MQ and database architecture. I have been keeping my eye on the market, and the laser focus on full stack developers is a real turn-off.

When was the last time you had an outage because the UI didn't work right? I can't count the number of outages resulting from inexperienced developers introducing a bug in the business logic or middle tier. Am I correct in assuming that the shops that are always looking for full stack developers just aren't grown up yet?

sjames (Slashdot reader #1,099) responded that "They are a thing, but in order to have comprehensive experience in everything involved, the developer will almost certainly be older than HR departments in 'the valley' like to hire."

And Dave Ostrander argues that "In the last 10 years front end software development has gotten really complex. Gulp, Grunt, Sass, 35+ different mobile device screen sizes and 15 major browsers to code for have made the front end skillset very valuable." The original submitter argues that front-end development "is a much simpler domain," leading to its own discussion.

Share your own thoughts in the comments. Are "full-stack" developers a thing?
Privacy

Ask Slashdot: Why Are There No True Dual-System Laptops Or Tablet Computers? 378

dryriver writes: This is not a question about dual-booting OSs -- having 2 or more different OSs installed on the same machine. Rather, imagine that I'm a business person or product engineer or management consultant with a Windows 10 laptop that has confidential client emails, Word documents, financial spreadsheets, product CAD files or similar on it. Business stuff that needs to stay confidential per my employment contract or NDAs or any other agreement I may have signed. When I have to access the internet from an untrusted internet access point that somebody else controls -- free WiFi in a restaurant, cafe or airport lounge in a foreign country for example -- I do not want my main Win 10 OS, Intel/AMD laptop hardware or other software exposed to this untrusted internet connection at all. Rather, I want to use a 2nd and completely separate System On Chip or SOC inside my laptop running Linux or Android to do my internet accessing. In other words, I want to be able to switch to a small 2nd standalone Android/Linux computer inside my Windows 10 laptop, so that I can do my emailing and internet browsing just about anywhere without any worries at all, because in that mode, only the small SOC hardware and its RAM is exposed to the internet, not any of the rest of my laptop or tablet. A hardware switch on the laptop casing would let me turn the 2nd SOC computer on when I need to use it, and it would take over the screen, trackpad and keyboard when used. But the SOC computer would have no physical connection at all to my main OS, BIOS, CPU, RAM, SSD, USB ports and so on. Does something like this exist at all (if so, I've never seen it...)? And if not, isn't this a major oversight? Wouldn't it be worth sticking a $200 Android or Linux SOC computer into a laptop if that enables you to access the internet anywhere, without any worries that your main OS and hardware can be compromised by 3rd parties while you do this?
Graphics

Ask Slashdot: How Did Real-Time Ray Tracing Become Possible With Today's Technology? 145

dryriver writes: There are occasions where multiple big tech manufacturers all announce the exact same innovation at the same time -- e.g. 4K UHD TVs. Everybody in broadcasting and audiovisual content creation knew that 4K/8K UHD and high dynamic range (HDR) were coming years in advance, and that all the big TV and screen manufacturers were preparing 4K UHD HDR product lines because FHD was beginning to bore consumers. It came as no surprise when everybody had a 4K UHD product announcement and demo ready at the same time. Something very unusual happened this year at GDC 2018, however. Multiple GPU and platform companies -- Microsoft, Nvidia, and AMD -- as well as game developers and game engine makers, all announced that real-time ray tracing is coming to their mass-market products, and by extension, to computer games, VR content and other realtime 3D applications.

Why is this odd? Because for many years any mention of 30+ FPS real-time ray tracing was thought to be utterly impossible with today's hardware technology. It was deemed far too computationally intensive for today's GPU technology and far too expensive for anything mass market. Gamers weren't screaming for the technology. Technologists didn't think it was doable at this point in time. Raster 3D graphics -- what we have in DirectX, OpenGL and game consoles today -- was very, very profitable and could easily have evolved further the way it has for another 7 to 8 years. And suddenly there it was: everybody announced at the same time that real-time ray tracing is not only technically possible, but also coming to your home gaming PC much sooner than anybody thought. Working tech demos were shown. What happened? How did real-time ray tracing, which only a few 3D graphics nerds and researchers in the field talked about until recently, suddenly become so technically possible, economically feasible, and so guaranteed-to-be-profitable that everybody announced this year that they are doing it?
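To see why this was long considered out of reach, here is a deliberately tiny toy illustration -- not any vendor's implementation -- of the core work real-time ray tracing has to do: fire at least one ray per pixel and test it against scene geometry, before shadow, reflection and global-illumination bounces multiply that cost.

```python
# Toy sketch (illustrative only, not how any announced engine works):
# every pixel fires a primary ray that must be intersected with scene
# geometry. Real renderers add many more rays per pixel for shadows,
# reflections and soft lighting. Scene and camera values are made up.
import numpy as np

WIDTH, HEIGHT = 320, 180
sphere_center = np.array([0.0, 0.0, -3.0])
sphere_radius = 1.0

image = np.zeros((HEIGHT, WIDTH))
for y in range(HEIGHT):
    for x in range(WIDTH):
        # Primary ray through this pixel from a pinhole camera at the origin.
        u = (2 * (x + 0.5) / WIDTH - 1) * (WIDTH / HEIGHT)
        v = 1 - 2 * (y + 0.5) / HEIGHT
        direction = np.array([u, v, -1.0])
        direction /= np.linalg.norm(direction)

        # Ray-sphere intersection: solve |origin + t*direction - center|^2 = r^2.
        oc = -sphere_center  # ray origin is (0, 0, 0)
        b = 2.0 * np.dot(oc, direction)
        c = np.dot(oc, oc) - sphere_radius ** 2
        discriminant = b * b - 4.0 * c
        image[y, x] = 1.0 if discriminant >= 0.0 else 0.0  # hit = white, miss = black

print("primary rays traced:", WIDTH * HEIGHT)  # 57,600 for one tiny frame, one object, no bounces
```

Even this one-object, zero-bounce thumbnail needs tens of thousands of intersection tests; pushing that to 4K at 60 frames per second, with thousands of objects and several bounces per pixel, is the gap that the newly announced hardware acceleration and AI denoising techniques are meant to close.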
Open Source

Ask Slashdot: Can FOSS Help In the Fight Against Climate Change? 154

dryriver writes: Before I ask my question, there already is free and open-source software (FOSS) for wind turbine design and simulation called QBlade. It lets you calculate turbine blade performance using nothing more than a computer and appears to be compatible with Xfoil as well. But consider this: the ultimate wind turbine rotor -- the most efficient, most usable in the real world and most widely deployable design -- may not have traditional "blades" or "foils" at all, but may be a non-propeller-like, complex and possibly rather strange-looking three-dimensional rotor of the sort that only a 3D printer could prototype easily. It may be on a vertical or horizontal axis. It may have air flowing through channels in its non-traditional structure, rather than just around it. Nobody really knows what this "ultimate wind turbine rotor" may look like.

The easiest way to find such a rotor might be through machine learning. You get an algorithm to create complex non-traditional 3D rotor shapes, simulate their behavior in wind, and then mutate the design, simulate again, and get a machine learning algorithm to learn what sort of mutations lead to a better-performing 3D rotor. In theory, enough iterations -- perhaps millions or more -- should eventually lead to the "ultimate rotor" or something closer to it than what is used in wind turbines today. Is this something FOSS developers could tackle, or is this task too complex for non-commercial software? The real-world impact of such a FOSS project could be that far better wind turbines can be designed, manufactured and deployed than currently exist, and the fight against climate change becomes more effective: the better your wind turbines perform, and the more usable they are, the more of a fighting chance humanity has. Could FOSS achieve this?
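Here is a minimal sketch of that mutate-simulate-select loop, assuming the rotor shape can be reduced to a parameter vector; simulate() is a stand-in that a real FOSS project would replace with calls to an aerodynamic solver such as a blade-element-momentum or CFD code:

```python
# Minimal sketch of the mutate-simulate-select loop described above.
# The rotor "shape" is just a parameter vector and simulate() is a
# placeholder fitness function; a real project would invoke an
# aerodynamic solver here instead.
import random

def simulate(shape):
    """Placeholder fitness: pretend performance peaks when every parameter equals 1.0."""
    return -sum((p - 1.0) ** 2 for p in shape)

def mutate(shape, sigma=0.1):
    return [p + random.gauss(0.0, sigma) for p in shape]

population = [[random.uniform(0.0, 2.0) for _ in range(8)] for _ in range(20)]
for generation in range(200):
    ranked = sorted(population, key=simulate, reverse=True)
    parents = ranked[:5]                                   # keep the best designs
    children = [mutate(random.choice(parents)) for _ in range(15)]
    population = parents + children                        # next generation

best = max(population, key=simulate)
print("best fitness found:", round(simulate(best), 4))
```

The expensive part is the solver call, not the evolutionary bookkeeping, so a FOSS effort could plausibly focus on wiring existing open simulation codes (QBlade, OpenFOAM and the like) into a loop of this shape and distributing the runs across volunteers' machines.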
Facebook

Ask Slashdot: Is There a Good Alternative to Facebook? (washingtonpost.com) 490

Long-time Slashdot reader Lauren Weinstein argues that fixing Facebook may be impossible because "Facebook's entire ecosystem is predicated on encouraging the manipulation of its users by third parties who possess the skills and financial resources to leverage Facebook's model. These are not aberrations at Facebook -- they are exactly how Facebook was designed to operate." Meanwhile one fund manager is already predicting that sooner or later every social media platform "is going to become MySpace," adding that "Nobody young uses Facebook," and that the backlash over Cambridge Analytica "quickens the demise."

But Slashdot reader silvergeek asks, "is there a safe, secure, and ethical alternative?" to which tepples suggests "the so-called IndieWeb stack using the h-entry microformat." He also suggests Diaspora, with an anonymous Diaspora user adding that "My family uses a server I put up to trade photos and posts... Ultimately more people need to start hosting family servers to help us get off the cloud craze... NethServer is a pretty decent CentOS based option."

Meanwhile Slashdot user Locke2005 shared a Washington Post profile of Mastodon, "a Twitter-like social network that has had a massive spike in sign-ups this week." Mastodon's code is open-source, meaning anybody can inspect its design. It's distributed, meaning that it doesn't run in some data center controlled by corporate executives but instead is run by its own users who set up independent servers. And its development costs are paid for by online donations, rather than through the marketing of users' personal information... Rooted in the idea that it doesn't benefit consumers to depend on centralized commercial platforms sucking up users' personal information, these entrepreneurs believe they can restore a bit of the magic from the Internet's earlier days -- back when everything was open and interoperable, not siloed and commercialized.

The article also interviews the founders of Blockstack, a blockchain-based marketplace for apps where all user data remains local and encrypted. "There's no company in the middle that's hosting all the data," they tell the Post. "We're going back to the world where it's like the old-school Microsoft Word -- where your interactions are yours, they're local and nobody's tracking them." On Medium, Mastodon founder Eugene Rochko also acknowledges Scuttlebutt and Hubzilla, ending his post with a message to all social media users: "To make an impact, we must act."

Lauren Weinstein believes Google has already created an alternative to Facebook's "sick ecosystem": Google Plus. "There are no ads on Google+. Nobody can buy their way into your feed or pay Google for priority. Google doesn't micromanage what you see. Google doesn't sell your personal information to any third parties..." And most importantly, "There's much less of an emphasis on hanging around with those high school nitwits whom you despised anyway, and much more a focus on meeting new persons from around the world for intelligent discussions... G+ posts more typically are about 'us' -- and tend to be far more interesting as a result." (Even Linus Torvalds is already reviewing gadgets there.)

Wired has also compiled their own list of alternatives to every Facebook service. But what are Slashdot's readers doing for their social media fix? Leave your own thoughts and suggestions in the comments.

Is there a good alternative to Facebook?
Sci-Fi

Ask Slashdot: Is Beaming Down In Star Trek a Death Sentence? 593

Artem Tashkinov writes: Some time ago, Ars Technica ran a monumental article on the beaming of consciousness in Star Trek and its implications -- and, more importantly, on whether it's plausible to achieve that without killing a person in the process.

It seems possible in the Star Trek universe. Physicists, however, currently find the idea absurd and unreal because there's no way you can transport matter and its quantum state without first destroying it and then recreating it perfectly, due to Heisenberg's Uncertainty Principle. The biggest conundrum of all is the fact that pretty much everyone understands that consciousness is a physical state of the brain, which features continuity as its primary principle; yet it surely seems like copying said state produces a new person altogether, which raises the problem of consciousness being local to one's skull and inseparable from gray matter. This idea sounds a bit unscientific because it introduces the notion that there's something about our brain which cannot be described in terms of physics, almost like a soul.

This also raises another very difficult question: how do we know we are the same person when we wake up in the morning, or after being put under general anesthesia? What are your thoughts on the topic?
Businesses

Ask Slashdot: Were Developments In Technology More Exciting 30 Years Ago? 231

dryriver writes: We live in a time where mainstream media, websites, blogs, social media accounts, your barely computer-literate next-door neighbor and so forth frequently rave about the "innovation" that is happening everywhere. But as someone who experienced developments in technology back in the 1980s and 1990s, in computing in particular, I cannot shake the feeling that, somehow, the "deep nerds" who were innovating back then did it better and with more heartfelt passion than I can feel today. Of course, tech from 30 years ago seems a bit primitive compared to today -- computer gear is faster and sleeker nowadays. But it seems that the core techniques and core concepts used in much of what is called "innovation" today were invented for the first time, one after the other, back then -- going back as far as the 1950s, maybe. I get the impression that much of what makes billions in profits today and wows everyone consists of mere improvements on what was actually invented and trailblazed for the first time 2, 3, 4, 5 or more decades ago. Is there much genuine "inventing" and "innovating" going on today, or are tech companies essentially repackaging, into sleeker, sexier 21st century tech gadgets, the R&D and know-how that long-forgotten deep nerds brought into the world decades ago? Is Alexa, Siri, the Xbox, Oculus Rift or iPhone truly what could be considered "amazing technology," or should we have bigger and badder tech and innovation in the year 2018?
Books

Ask Slashdot: I Want To Get Into Comic Books, But Where Do I Start? 212

An anonymous reader writes: Hi fellow readers. I don't recall reading many comic books as a kid (mostly because I could not afford them), but of late, I have been considering giving that a shot. I wanted to ask if you had any tips to share. Do I start with paperback editions, or do I jump directly into digital? Also, could you recommend a few good sci-fi comic book series? Thanks in advance!
The Almighty Buck

Ask Slashdot: Should You Tell Your Coworkers How Much You Make? 357

An anonymous reader writes: Asking someone how much money they make is often -- if not always -- considered impolite. But over the years, there has been a movement toward more salary transparency. Some say salary transparency can make workplaces more equitable by helping to eliminate the gender and racial pay gaps. Even in companies that haven't decided to officially make all salaries open, some employees are taking matters into their own hands and sharing their pay rate with their coworkers. What's your take on this?
Networking

Ask Slashdot: How Can I Prove My ISP Slows Certain Traffic? 203

Long-time Slashdot reader GerryGilmore is "a basically pretty knowledgeable Linux guy totally comfortable with the command line." But unfortunately, he lives in north Georgia, "where we have a monopoly ISP provider...whose service overall could charitably be described as iffy." Sometimes, I have noticed that certain services like Netflix and/or HBONow will be ridiculously slow, but -- when I run an internet speed test from my Linux laptop -- the basic throughput is what it's supposed to be for my DSL service. That is, about 3Mbps due to my distance from the nearest CO. Other basic web browsing seems to be fine... I don't know enough about network tracing to be able to identify where/why such severe slowdowns in certain circumstances are occurring.
Slashdot reader darkharlequin has also noticed a speed decrease on Comcast "that magickally resolves when I run internet speed tests." But if the original submitter's ultimate goal is delivering evidence to his local legislators so they can put pressure on his ISP -- what evidence is there? Leave your best answers in the comments. How can he prove his ISP is slowing certain traffic?
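One commonly suggested approach -- sketched here with placeholder URLs, since the right test files depend on which services feel slow -- is to log timed downloads from a neutral host and from the suspect service side by side, over days or weeks, so that any consistent gap shows up in data rather than anecdotes:

```python
# Hedged sketch: periodically time downloads from a neutral host and from
# the service that seems slow, and append both to a CSV so they can be
# compared over days or weeks. The URLs are placeholders -- substitute
# real test files you are allowed to fetch.
import csv
import datetime
import time
import urllib.request

TEST_URLS = {
    "neutral_host": "https://example.com/10MB.bin",             # placeholder
    "video_cdn":    "https://video-cdn.example.net/10MB.bin",   # placeholder
}

def measure_mbps(url, chunk_size=65536, max_bytes=10 * 1024 * 1024):
    """Download up to max_bytes from url and return throughput in megabits per second."""
    start, received = time.monotonic(), 0
    with urllib.request.urlopen(url, timeout=30) as response:
        while received < max_bytes:
            data = response.read(chunk_size)
            if not data:
                break
            received += len(data)
    elapsed = time.monotonic() - start
    return (received * 8 / 1_000_000) / elapsed

with open("throughput_log.csv", "a", newline="") as f:
    writer = csv.writer(f)
    for name, url in TEST_URLS.items():
        try:
            result = round(measure_mbps(url), 2)
        except OSError as exc:
            result = f"error: {exc}"
        writer.writerow([datetime.datetime.now().isoformat(), name, result])
```

Run from cron every hour or so, a log like this does not prove intent, but a persistent gap between the neutral host and the video CDN at the same times of day -- while generic speed tests keep showing the full 3Mbps -- is the kind of concrete, repeatable evidence a legislator or regulator can be shown.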
Wireless Networking

Ask Slashdot: Are There Any USB-C Wireless Video Solutions? 127

jez9999 writes: Sometimes it feels like we're on the cusp of a technology but not quite there yet, and that's the way it feels for me after searching around for USB-C wireless video solutions. There are several wireless video solutions that use HDMI on the receiver end, of course, but these aren't ideal because HDMI can't provide power. This means you need a separate receiver box and power cable going into the box, but cables are what you're trying to get away from with wireless video!

So the answer to this would seem to be USB-C. It supports HDMI video as well as power, so in theory you could create a receiver dongle that just plugged into a TV (or monitor with speakers) and required no external power cable. Unfortunately, I haven't been able to find anything like this on the market.

There is Airtame, but that doesn't work with a 'dumb' TV -- it needs to plug in to a computer that you can install software on to stream the video. What I'd like is to be able to wall-mount a new TV and just plug in a wireless dongle to stream the video with no extra setup required on the receiver end.

Does anyone know of a solution like this that exists right now, or one that's being developed?
Books

Slashdot Asks: What Are Some Apps and Online Services You Use To Discover, Track and Evaluate Movies, TV Shows, Music and Books? 84

Earlier this week, news blog Engadget had a post in which the author outlined some of the apps that could help people keep track of TV shows, books, and music habits. The reader who submitted the story said the list was quite underwhelming. We're curious to hear how Slashdot readers tackle these things.
Windows

Ask Slashdot: Should We Worry Microsoft Will 'Embrace, Extend, and Extinguish' Linux? (betanews.com) 431

BrianFagioli writes: While there is no proof that anything nefarious is afoot, it does feel like maybe the Windows-maker is hijacking the Linux movement a bit by serving distros in its store. I hope there is no "embrace, extend, and extinguish" shenanigans going on.

Just yesterday, we reported that Kali Linux was in the Microsoft Store for Windows 10. That was big news, but it was not particularly significant in the grand scheme, as Kali is not very well known. Today, there is some undeniably huge news -- Debian is joining SUSE, Ubuntu, and Kali in the Microsoft Store. Should the Linux community be worried?

My concern lately is that Microsoft could eventually try to make the concept of running a Linux distro natively a thing of the past. Whether or not that is the company's intention is unknown. The Windows maker gives no reason to suspect evil plans, other than past negative comments about Linux and open source. For instance, former Microsoft CEO Steve Ballmer once called Linux "cancer" -- seriously.

Software

Ask Slashdot: Best To-Do/Task List Software? 278

Albanach writes: Despite searching, I have not identified a good solution for managing to-do lists, a problem that can't be unique or unusual. For a variety of reasons, I need something I host myself, which allows me to organize tasks, give them due dates and/or priorities and to easily reorganize. I'd prefer a web interface so that I can access my list from home/work/mobile. My searches generally turned up hosted solutions that don't work for privacy reasons, or very old software that has shown no sign of updates in years. What are other Slashdotters using to manage their real-world task list?
Education

Ask Slashdot: How Would You Teach 'Best Practices' For Programmers? 220

An anonymous reader writes: I've been asked to put together a half-day workshop whose title is "Thinking Like a Programmer." The idea behind this is that within my institution (a university), we have a vast number of self-taught programmers who have never been taught "best practices" or anything about software engineering. This workshop's intention is to address this lack of formal training.

The question is, what should be covered in this workshop? If you have an idea -- that also has an example of best practice -- please share!

It's really two questions -- which "thinking like a programmer" topics should be covered, and what examples should be used to illustrate best practices for the material. So leave your best thoughts in the comments.
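As one hedged illustration of the kind of before/after snippet such a workshop could be built around (the example itself is invented, not taken from the submitter):

```python
# Illustrative before/after pair for a "best practices" workshop (invented example).

# Before: it works, but the names are cryptic, nothing is documented,
# and an empty list crashes with a confusing ZeroDivisionError.
def f(l):
    t = 0
    for x in l:
        t += x
    return t / len(l)

# After: descriptive names, a docstring, explicit input validation,
# and a pure function that is trivial to cover with a unit test.
def mean(values):
    """Return the arithmetic mean of a non-empty sequence of numbers."""
    if not values:
        raise ValueError("mean() requires at least one value")
    return sum(values) / len(values)

assert mean([2.0, 4.0, 6.0]) == 4.0  # the kind of test attendees can be asked to write
```

A single small pair like this can carry several of the workshop's themes at once: naming, documentation, defensive handling of bad input, and structuring code so it is easy to test.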

How would you teach best practices for programmers?
Software

Ask Slashdot: Software To Visualize, Manage Homeowner's Association Projects? 115

New submitter jishak writes: I am a long-time Slashdot reader who has been serving on a homeowners association (HOA) board for 7 years. Much of the job requires managing projects that happen around the community. For example, landscaping, plumbing, building maintenance, etc. Pretty much all the vendors work with paper or a management company scans the paper, giving us a digital version. I am looking for suggestions on tools to visualize and manage projects using maps/geolocation software to see where jobs are happening and track work, if that makes sense. I did a rudimentary search but didn't really find anything other than a couple of companies who make map software which is good for placing static items like a building on a map but not for ongoing work. There are tools like Visio or Autodesk, which are expensive and good for a single building, but they don't seem so practical for an entire community of 80 units with very little funds (I am a volunteer board member). The other software packages I have seen are more like general project management or CRM tools but they are of no use to track where trees are planted, which units have had termite inspections, etc.

I am looking for tools where I could see a map and add custom layers for different projects that can be enabled/disabled or show historical changes. If it is web-based and can be shared for use among other board members, property managers, and vendors, or viewable on a phone or tablet, that would be a plus. I am not sure how to proceed and a quick search on Slashdot didn't really turn anything up. I can't be the first person to encounter this type of problem. Readers of Slashdot, what do you recommend? If I go down the road of having to roll my own solution, can you offer ideas on how to implement it? I am open to suggestions.
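If rolling your own does become necessary, one lightweight possibility -- offered purely as a sketch, not a product recommendation -- is to generate a self-hosted HTML map with the Python folium library (a wrapper around Leaflet), with one toggleable layer per project type; the coordinates and labels below are invented:

```python
# Hedged sketch: folium (a Python wrapper around the Leaflet mapping
# library) can generate a standalone HTML map with toggleable layers,
# one per project type. Coordinates and labels here are invented.
import folium

m = folium.Map(location=[37.7749, -122.4194], zoom_start=17)  # center on your community

trees = folium.FeatureGroup(name="Tree plantings")
folium.Marker([37.7751, -122.4190], popup="Unit 12: oak planted 2018-03-02").add_to(trees)
trees.add_to(m)

termites = folium.FeatureGroup(name="Termite inspections", show=False)
folium.Marker([37.7747, -122.4198], popup="Unit 34: inspected 2018-01-15").add_to(termites)
termites.add_to(m)

folium.LayerControl().add_to(m)   # checkbox control so board members can toggle layers
m.save("hoa_projects.html")       # plain HTML file: email it, share it, or host it anywhere
```

The underlying project data could live in a plain CSV that any board member can edit, with the map regenerated by a small script whenever rows are added; the resulting HTML file is viewable on a phone or tablet and can be served from any inexpensive web host.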
Businesses

Slashdot Asks: What Do People Misunderstand or Underappreciate About Apple? (fastcompany.com) 487

In an interview with Fast Company, Apple CEO Tim Cook says people who have not used his company's products miss "how different Apple is versus other technology companies." A person who is just looking at the company's revenues and profits, says Cook, might think that Apple "is good at making money." But, he says, "that's not who we are." In Cook's view, Apple is: "We're a group of people who are trying to change the world for the better, that's who we are. For us, technology is a background thing.

We don't want people to have to focus on bits and bytes and feeds and speeds. We don't want people to have to go to multiple [systems] or live with a device that's not integrated. We do the hardware and the software, and some of the key services as well, to provide a whole system. We do that in such a way that we infuse humanity into it. We take our values very seriously, and we want to make sure all of our products reflect those values. There are things like making sure that we're running our [U.S.] operations on 100% renewable energy, because we don't want to leave the earth worse than we found it. We make sure that we treat well all the people who are in our supply chain. We have incredible diversity, not as good as we want, but great diversity, and it's that diversity that yields products like this."
What do you think?
AI

Slashdot Asks: Which Smart Speaker Do You Prefer? 234

Every tech company wants to produce a smart speaker these days. Earlier this month, Apple finally launched the HomePod, a smart speaker that uses Siri to answer basic questions and play music via Apple Music. In December, Google released their premium Google Home Max speaker that uses the Google Assistant and Google's wealth of knowledge to play music, answer questions, set reminders, and so on. It may be the most advanced smart speaker on the market as it has the hardware capable of playing high fidelity audio, and a digital assistant that can perform over one million actions. There is, however, no denying the appeal of the Amazon Echo, which is powered by the Alexa digital assistant. Since it first made its debut in late 2014, it has had more time to develop its skill set. Amazon says Alexa controls "tens of millions of devices," including Windows 10 PCs.

A new report from The Guardian, citing the industry site MusicAlly, says that Spotify is working on a line of "category defining" hardware products "akin to Pebble Watch, Amazon Echo, and Snap Spectacles." The streaming music company has posted an ad for a senior product manager to "define the product requirements for internet connected hardware [and] the software that powers it." With Spotify looking to launch a smart speaker in the not-too-distant future, the decision to purchase a smart speaker has become all the more difficult. Do you own a smart speaker? If so, which device do you own and why? Do you see a clear winner, or can they all satisfy your basic needs?
Programming

Who Killed The Junior Developer? (medium.com) 386

Melissa McEwen, writing on Medium: A few months ago I attended an event for women in tech. A lot of the attendees were new developers, graduates from code schools or computer science programs. Almost everyone told me they were having trouble getting their first job. I was lucky. My first "real" job out of college was "Junior Application developer" at Columbia University in 2010. These days it's a rare day to find even a job posting for a junior developer position. People who advertise these positions say they are inundated with resumes. But on the senior level companies complain they can't find good developers. Gee, I wonder why?

I'm not really sure the exact economics of this, because I don't run these companies. But I know what companies have told me: "we don't hire junior developers because we can't afford to have our senior developers mentor them." I've seen the rates for senior developers because I am one and I had project managers that had me allocate time for budgeting purposes. I know the rate is anywhere from $190-$300 an hour. That's what companies believe they are losing on junior devs.
