Wi-Fi 6?

Get ready for the next generation of wifi (wireless) technology: Wi-Fi 6 is going to start appearing on devices in 2019. But should you replace your old router with a new one? And will this make your internet run faster? Here's what you should know!
The history of wifi
Those of you of a certain age will remember when home internet access was only wired—only one computer could get online at a time, and a single MP3 took half an hour to download. Then wifi came along and changed everything. The first wifi protocol appeared in 1997, offering 2Mbit/s link speeds, but it was only with the arrival of 802.11b and 11Mbit/s speeds in 1999 that people seriously started thinking about home wifi.
Wifi standards, as well as a whole host of other electronics standards, are managed by the IEEE: The Institute of Electrical and Electronics Engineers. Specifically, IEEE 802 refers to local area network standards, and 802.11 focuses on wireless LAN. In the 20 years since 802.11b arrived, we’ve seen numerous new standards of all sorts come out, though not all of them apply to home networking.
The introduction of 802.11g in 2003 (54Mbit/s) and 802.11n in 2009 (a whopping 600Mbit/s) were both significant moments in the history of wifi. Another significant step forward was the introduction of dual-band routers with both 2.4GHz and 5GHz bands, tied to the arrival of 802.11n, which could offer faster speeds at shorter ranges.
Today, with 802.11ac in place, that 5GHz band can push speeds of 1,300Mbit/s, so we’re talking speeds that are more than 600 times faster than they were in 1997. Wi-Fi 6 takes that another step forward, but it’s not just speed that’s improving.
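That "more than 600 times faster" figure is simple arithmetic from the link rates quoted above; here's a quick sanity check in Python:

```python
# Peak link rates mentioned above, in Mbit/s.
original_1997 = 2    # the first 802.11 protocol (1997)
wifi5_5ghz = 1300    # 802.11ac on the 5GHz band

speedup = wifi5_5ghz / original_1997
print(f"802.11ac is {speedup:.0f}x faster than 1997's 2Mbit/s")  # 650x
```

Remember these are theoretical peak link rates, not the real-world speeds you'll see on a speed test.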
Explaining wifi technology can get quite technical. A lot of recent improvements, including those arriving with Wi-Fi 6, involve some clever engineering to squeeze more bandwidth out of the same 2.4GHz and 5GHz bands your router already employs. The end result is more capacity on the same channels, with less interference between them, as well as faster data transfer speeds.
Turning wifi up to six
In the past, Wi-Fi versions were identified by a letter or a pair of letters referring to a wireless standard. The current version is 802.11ac, but before that we had 802.11n, 802.11g, 802.11a, and 802.11b. It wasn't easy to keep track of, so the Wi-Fi Alliance — the group that stewards the implementation of Wi-Fi — is changing it.
All of those convoluted codenames are being changed. So instead of the current Wi-Fi being called 802.11ac, it’ll be called Wi-Fi 5 (because it’s the fifth version). It’ll probably make more sense this way, starting with the first version of Wi-Fi, 802.11b:
Wi-Fi 1: 802.11b (1999)
Wi-Fi 2: 802.11a (1999)
Wi-Fi 3: 802.11g (2003)
Wi-Fi 4: 802.11n (2009)
Wi-Fi 5: 802.11ac (2014)
Now, instead of wondering whether “ac” is better than “n” or if the two versions even work together, you’ll just look at the number. Wi-Fi 5 is higher than Wi-Fi 4, so obviously it’s better. And since Wi-Fi networks have always worked together, it’s somewhat clearer that Wi-Fi 5 devices should be able to connect with Wi-Fi 4 devices, too. (Technically, Wi-Fi 1, Wi-Fi 2, and Wi-Fi 3 aren’t being branded because they aren’t widely in use, but I’ve labeled how it would look above for clarity.)
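The retroactive renaming above amounts to nothing more than a lookup table. A minimal sketch in Python, using exactly the list given above (Wi-Fi 1 through 3 are included even though the Alliance isn't officially branding them):

```python
# Wi-Fi generation number -> (IEEE standard, year introduced),
# per the list above. Generations 1-3 are unofficial retroactive labels.
WIFI_GENERATIONS = {
    1: ("802.11b", 1999),
    2: ("802.11a", 1999),
    3: ("802.11g", 2003),
    4: ("802.11n", 2009),
    5: ("802.11ac", 2014),
    6: ("802.11ax", 2019),  # Wi-Fi 6, expected on devices from 2019
}

def describe(generation: int) -> str:
    standard, year = WIFI_GENERATIONS[generation]
    return f"Wi-Fi {generation} = {standard} ({year})"

print(describe(5))  # Wi-Fi 5 = 802.11ac (2014)
print(describe(6))  # Wi-Fi 6 = 802.11ax (2019)
```

The whole point of the rebrand is that this mapping becomes something consumers never have to think about: a bigger number simply means a newer standard.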
The Wi-Fi Alliance even wants to see this branding go beyond hardware. So in the future when you connect to a Wi-Fi network on your phone or laptop, your device will tell you what Wi-Fi version you’re connected to. That way, if two networks are available — one showing “4” and the other showing “5” — you’d be able to choose the newer, faster option.
Now that the retroactive renaming is done, it’s time for the future. If you’ve been closely following router developments over the past year (no judgments here), you’ll know that the next generation of Wi-Fi is on the horizon, with the promise of faster speeds and better performance when handling a multitude of devices. It was supposed to be called 802.11ax, but now it’ll go by a simpler name: Wi-Fi 6.
One of the most important changes Wi-Fi 6 brings with it is, of course, the new naming system: Using a simple succession of numbers is going to make it a lot easier for consumers to keep track of standards and make sure they’ve got compatible kit set up. The more technical term for Wi-Fi 6 is 802.11ax, if you prefer the old naming.
Expect to see the new Wi-Fi 6 name on hardware products and inside software menus from 2019, as well as funky little logos not unlike the one Google uses for its Chromecast devices.
As always, the improvements with this latest generation of wifi come in two key areas: raw speed and throughput (if wifi were a highway, we'd be talking about a higher maximum speed limit for vehicles, as well as more lanes to handle more vehicles at once). Wi-Fi 6 will support 8K video streaming, provided your internet supplier gives you access to sufficient download speeds in the first place.
In practice that means support for transfer rates of 1.1Gbit/s over the 2.4GHz band (with four streams available) and 4.8Gbit/s over the 5GHz band (with eight streams available), though the technology is still being refined ahead of its full launch next year—those speeds may, in fact, go up (it's been hitting 10Gbit/s in the lab). Roughly speaking, you can look forward to 4x to 10x speed increases in your wifi.
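Those headline numbers break down per spatial stream; a rough back-of-the-envelope in Python, using only the pre-final figures quoted above:

```python
# Quoted Wi-Fi 6 aggregate rates and stream counts (spec not yet final).
band_24ghz = {"total_mbit_s": 1100, "streams": 4}
band_5ghz = {"total_mbit_s": 4800, "streams": 8}

for name, band in [("2.4GHz", band_24ghz), ("5GHz", band_5ghz)]:
    per_stream = band["total_mbit_s"] / band["streams"]
    print(f"{name}: ~{per_stream:.0f} Mbit/s per stream")
# 2.4GHz: ~275 Mbit/s per stream
# 5GHz: ~600 Mbit/s per stream
```

A single device rarely uses all streams at once; the extra streams are largely what lets one router serve many devices without the slowdown described later in this piece.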
Another improvement Wi-Fi 6 will bring is improved efficiency, which means a lower power draw, which means less of a strain on battery life (or lower figures on your electricity bill). It’s hard to quantify the difference exactly, especially as Wi-Fi 6 has yet to be finalized, but it’s another step in the right direction for wifi standards—it shouldn’t suck the life out of your phone or always-on laptop quite as quickly.
What will you have to do?
Not a lot. As is usually the case, Wi-Fi 6 is going to be backwards compatible with all the existing wifi gear out there, so if you bring something home from the gadget shop that supports the new standard, it will work fine with your current setup—you just won’t be able to get the fastest speeds until everything is Wi-Fi 6 enabled.
How long that takes is going to depend on hardware manufacturers, software developers, internet service providers, and everyone else in the industry. You might just have to sit tight until your broadband provider of choice deems the time is right to upgrade the hardware it supplies to you (though you could just upgrade the router yourself).
When you’re out and about in the wider world you might start to see certain networks advertising faster speeds, using the new terminology, but this rebrand is brand new: We’ll just have to wait and see how these new names and logos get used in practice. Would you swap coffee shops for Wi-Fi 6?
Bear in mind that it's also going to take a while for this to roll out properly. When we say 2019, that's the very earliest that fully approved Wi-Fi 6 devices are going to start appearing, so it might be months or years before everyone catches up. Some early devices making use of the draft technology have already appeared on the scene.
Even if you have no problems with download and upload speeds right now, Wi-Fi 6 is intended to fix some of the pain points that still exist: Trying to get decent wifi in a crowded space, for example, or trying to connect 20 different devices to the same home router without the wireless performance falling off a cliff.
The Wi-Fi Alliance says that it expects companies to adopt this numerical advertising in place of the classic lettered versions. It also expects to see earlier versions of Wi-Fi start to be referred to by their updated numbered names as well.
Because the Wi-Fi Alliance represents just about every major company that makes any kind of product with Wi-Fi in it, its actions usually reflect what the industry wants. So presumably, tech companies are on board with the branding change and will start to advertise it this way.

AMD Ryzen Threadripper 2 with up to 32 Cores!

AMD Ryzen Threadripper 2 with up to 32 cores: yes, you read that right, 32 cores. AMD has quickly ramped up its Zen architecture and is now delivering Threadripper 2 (second generation). AMD's Zeppelin silicon has 8 cores, and the first-generation Threadripper uses two of those dies to reach its top SKU of 16 cores. Inside the CPU, however, there are four pieces of silicon: two active and two inactive. For this second generation, called Threadripper 2 or the Threadripper 2000-series, AMD is making those inactive dies active, substantially increasing the core count for high-end desktop and workstation users.

On AMD's EPYC server processors, all four dies are active, with eight cores on each die (four per CCX). EPYC has eight memory channels, but AMD's X399 platform only supports four. For the first generation this meant that each of the two active dies had two memory channels attached; in second-generation Threadripper this is still the case: the two newly active parts of the chip do not have direct memory access.

Not long ago, several motherboard vendors stated that some of the current X399 motherboards on the market might struggle to deliver enough power to the new parts, so we are likely to see a motherboard refresh from several manufacturers.

AMD's Threadripper 2 is quite competitive with high-end Core i7s and even the new Intel Core i9s, given the right circumstances. However, keep in mind that AMD may shortly move to a new CPU manufacturing process (7nm) that will further increase performance. So buyer beware: do your homework before making a purchase. AMD is finally starting to give Intel competition, especially on price. Threadripper 2 ranges from 8 cores to 32 cores, depending on which CPU flavor you choose!
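The core counts follow directly from the die layout described above; a quick sketch (assuming 8 cores per Zeppelin die, as stated):

```python
CORES_PER_DIE = 8  # AMD Zeppelin silicon, per the text above

def threadripper_cores(active_dies: int) -> int:
    """Total cores for a Threadripper package with this many active dies."""
    return active_dies * CORES_PER_DIE

print(threadripper_cores(2))  # 16 -- first-gen top SKU (two active dies)
print(threadripper_cores(4))  # 32 -- Threadripper 2 top SKU (all four dies)
```

Lower-core SKUs are made the same way, by disabling cores within each active die rather than by using different silicon.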

World's Fastest Supercomputer

IBM, Nvidia Build “World’s Fastest Supercomputer” for US Government
The DOE’s new Summit system features a unique architecture that combines HPC and AI computing capabilities.
IBM and DOE Launch World's Fastest Supercomputer
Frederic Lardinois (@fredericl), Jun 8, 2018
https://techcrunch.com/2018/06/08/ibms-new-summit-supercomputer-for-the-doe-delivers-200-petaflops/

IBM and the U.S. Department of Energy's Oak Ridge National Laboratory (ORNL) today unveiled Summit, the department's newest supercomputer. IBM claims that Summit is currently the world's "most powerful and smartest scientific supercomputer" with a peak performance of a whopping 200,000 trillion calculations per second. That performance should put it comfortably at the top of the Top 500 supercomputer ranking when the new list is published later this month. That would also mark the first time since 2012 that a U.S.-based supercomputer holds the top spot on that list.
Summit, which has been in the works for a few years now, features 4,608 compute servers with two 22-core IBM Power9 chips and six Nvidia Tesla V100 GPUs each. In total, the system also features over 10 petabytes of memory. Given the presence of the Nvidia GPUs, it’s no surprise that the system is meant to be used for machine learning and deep learning applications, as well as the usual high performance computing workloads for research in energy and advanced materials that you would expect to happen at Oak Ridge.
IBM was the general contractor for Summit and the company collaborated with Nvidia, RedHat and InfiniBand networking specialists Mellanox on delivering the new machine.
“Summit’s AI-optimized hardware also gives researchers an incredible platform for analyzing massive datasets and creating intelligent software to accelerate the pace of discovery,” said Jeff Nichols, ORNL associate laboratory director for computing and computational sciences, in today’s announcement.
Summit is one of two next-generation supercomputers that IBM is building for the DOE. The second is Sierra, which will be housed at the Lawrence Livermore National Laboratory. Sierra, also scheduled to go online this year, is less powerful at an expected 125 petaflops, but both systems are significantly more powerful than any other machine in the DOE's arsenal right now.

Karl Freund
Karl Freund is a Moor Insights & Strategy Senior Analyst for deep learning & HPC
Summit, at the Oak Ridge National Laboratory in Oak Ridge, Tennessee. Capable of over 200 petaflops (200 quadrillion operations per second), Summit consists of 4600 IBM dual socket Power 9 nodes, connected by over 185 miles of fiber optic cabling. Each node is equipped with 6 NVIDIA Volta TensorCore GPUs, delivering total throughput that is 8 times faster than its predecessor, Titan, for double precision tasks, and 100 times faster for reduced precision tasks common in deep learning and AI. China has held the top spot in the Top 500 for the last 5 years, so this brings the virtual HPC crown home to the USA.

Some of the specifications are truly amazing; the system exchanges water at the rate of 9 Olympic pools per day for cooling, and as an AI supercomputer, Summit has already achieved (limited) “exascale” status, delivering 3 exaflops of AI precision performance. What may be more important, though, is the science that this new system will enable—it is already at work on drug discovery using quantum chemistry, chronic pain analysis, and the study of mitochondrial DNA.
For those who cannot afford a full-fledged $100M supercomputer, NVIDIA also announced the new HGX-2 chassis, available from many vendors, which can be connected to a standard server for some serious AI in a box. The HGX-2 supports 16 Volta GPUs, interconnected via the new NVSwitch fabric to act as a single massive GPU, delivering 2 petaflops of performance for AI and HPC. As you can see, NVIDIA is paying a lot of attention to the idea of fusing AI with HPC.

The scientific advances in deep neural networks (DNNs) for HPC took center stage in the announcement. As I have noted in previous articles, DNNs are showing tremendous promise in High Performance Computing (HPC). DNNs can be trained with massive datasets created by running traditional simulations on supercomputers. The resulting AI can then be used to predict outcomes of new simulations with startling accuracy, in 1/1000th the time and cost. The good news for NVIDIA is that both supercomputing and AI are powered by—you guessed it, NVIDIA GPUs. With NVIDIA's new platforms, scientists have even more tools to use GPU hardware and to develop GPU software.

The announcement of Summit as the world’s fastest computer was not a surprise; as a public project funded by the U.S. DOE, Summit has frequently been the subject of discussion. What is significant is that NVIDIA and the DOE believe that the future of HPC will be infused with AI, all running on the same hardware. The NVIDIA GPUs are delivering 95% of Summit’s performance, cementing the legitimacy and leadership of GPU-accelerated computing. HGX-2 makes that an affordable path for many researchers and cloud providers, while Summit demonstrates the art of the possible and a public platform for research. When combined, AI plus HPC also paves the way for future growth for NVIDIA.

The Summit system, with 9,216 IBM processors boosted by 27,648 Nvidia graphics chips, takes as much room as two tennis courts and as much power as a small town. It’ll be used for civilian research into subjects like material science, cancer, fusion energy, astrophysics and the Earth’s changing climate.

Summit can perform 200 quadrillion (200,000 trillion) calculations per second, or 200 petaflops. Until now, the world’s fastest supercomputer has been the Sunway TaihuLight system at the National Supercomputing Center in Wuxi, China, capable of 93.01 petaflops.
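The totals in these last two paragraphs are easy to verify from the per-node figures quoted earlier (4,608 nodes, each with two Power9 CPUs and six Volta GPUs):

```python
nodes = 4608
cpus = nodes * 2   # two 22-core IBM Power9 chips per node
gpus = nodes * 6   # six Nvidia Tesla V100 GPUs per node
print(cpus, gpus)  # 9216 27648

# Summit vs. the previous #1, Sunway TaihuLight
summit_pflops = 200.0
taihulight_pflops = 93.01
print(f"Summit is ~{summit_pflops / taihulight_pflops:.1f}x faster")  # ~2.2x
```

So the processor counts quoted in the press coverage are internally consistent, and Summit's peak is a little over twice TaihuLight's.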

Graphene the Future of Computing?

Could make your computer a thousand (1,000x) times faster.
Highly conductive and ultra-thin.
Conducts electricity 10 times better than copper, and 250 times better than silicon.
Researchers built a transistor (circuit) from graphene and applied a current, resulting in a 1,000x increase in performance.

Graphene Computers Work 1000 Times Faster, Use Far Less Power

Graphene-coated copper could dramatically boost future CPU performance
• By Joel Hruska on February 21, 2017

IBM builds graphene chip that’s 10,000 times faster, using standard CMOS processes

While current chips are made of silicon, the prototype processor is built from carbon nanotubes (rolled-up sheets of graphene), with resistive RAM (RRAM) layered over it. The team claims this makes for "the most complex nanoelectronic system ever made with emerging nanotechnologies," creating a 3D computer architecture.
If you follow a lot of tech circles, you may have seen graphene (a super-thin layer of carbon arranged in such a way that it has electrical properties verging on miraculous) come up in the news quite a bit, receiving plaudits about its massively fluid electrical conductivity and possible applications in several different technologies. What you haven’t heard much of is the ugly part of graphene: It’s impossible to build semiconductor transistors out of the material as it stands now since it has no electrical band gap to speak of. If that sounds confusing, that’s alright. That’s what this article is for!
Band Gap? What’s That?
A band gap is the small energy gap between a material's valence band and its conduction band, and it tells us how readily current will flow between the two. It's like a little gatekeeper that keeps an electrical charge in place until it is "turned off." Virtually all computer chips are made of a semiconductor material, which has a moderate band gap: it neither conducts electricity too readily nor rejects every electrical charge. This comes down to basic molecular structure, so there is quite a bit of chemistry involved in building a chip.
Very large band gaps exist in materials like rubber, which resists electrical current so strongly that it would rather catch fire than carry a charge. That's why rubber is used to insulate the wires inside cables. Materials with a negligible band gap are known as conductors; superconductors go a step further still, carrying current with no resistance at all.
Today most chips are made of silicon, which serves as a very sturdy and reliable semiconductor. Remember, we need semiconductors that can quickly be turned on and off at will, not superconductors, which will lose the charge they were given the moment the band no longer supplies it.
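The conductor/semiconductor/insulator distinction above can be sketched as a simple classification by band-gap energy. The thresholds and example values below are rough, illustrative figures for the sake of the analogy, not authoritative material data:

```python
# Rough band-gap classification, in electron-volts (eV).
# Thresholds are illustrative; real materials science is messier.
def classify(band_gap_ev: float) -> str:
    if band_gap_ev < 0.1:
        return "conductor"       # charge flows freely, hard to switch off
    if band_gap_ev < 4.0:
        return "semiconductor"   # can be switched on and off -- chip material
    return "insulator"           # resists current, e.g. rubber around wires

# Approximate textbook band gaps:
materials = {"graphene": 0.0, "silicon": 1.12, "diamond": 5.5}
for name, gap in materials.items():
    print(f"{name}: {classify(gap)}")
# graphene: conductor
# silicon: semiconductor
# diamond: insulator
```

Graphene's zero entry in that table is exactly the problem the next section describes: with no gap, there is nothing for a transistor to switch against.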
Why Is Graphene Not Good for Building Chips?
As I mentioned earlier, graphene is an extremely efficient conductor of electricity but nothing much more than that. It can push a charge at an incredible speed, but it cannot retain it. In a binary system you may need to retain data so that your running programs don’t just close the instant they open. It’s important in a RAM chip, for example, to ensure that the data inside it can stay put and remain readable for the foreseeable future. When a transistor is in the “on” state, it registers a “1.” In an “off” state, it registers a “0.” A superconductor would be unable to “switch off” because the difference between “on” and “off” voltage is so small (because of the tiny band gap I mentioned earlier).
That’s not to say that graphene wouldn’t have a place in a modern-day computer. It certainly could be used to deliver information from one point to another quickly. Also, if supplemented by other technology, we could possibly see graphene used in transistors at some point in the future. Whether that would be an efficient investment of capital is up to the industry to decide.
There’s Another Material! (One I believe has more promise)
One of the problems with silicon is its inflexibility when working on ultra-thin surfaces: a piece of silicon can only be shaved so thin before it stops functioning. That's why graphene was being explored in the first place (it's a single atom thick). Since graphene may not prove promising without truckloads of money invested in its development, scientists began trying other materials, one of which is titanium trisulfide (TiS3). The material not only functions even at the thickness of a single molecule, but it also has a band gap very similar to that of silicon.
The implications of this are far-reaching for miniature technology products which pack a vast amount of hardware in a very constrained amount of space. Thinner materials will also dissipate heat more efficiently, making them favorable for large power-hungry computers.
Graphene As A Promising Material For Computer Processors
Since graphene technology was introduced, it has gained popularity as one of the most advanced materials, with diverse applications. It can be used in mechanical and biological engineering. Car manufacturers are taking advantage of its weight and strength, making it an excellent choice to combine with polymer composites.

It is also popular as a choice for energy storage and for solar cells. Recently, it has also generated buzz because of the introduction of the graphene processor, which is expected to improve computing in more ways than one.

IBM Taking Advantage of Graphene
Among others, IBM is one company that has expressed a serious commitment to building a graphene processor, which is expected to redefine the future of computers. By 2019, the company expects to develop a processor that is smaller and significantly more powerful than what is available on the market today. The goal is to build IBM graphene transistors that measure only 7 nanometres but are unrivalled in terms of the power they can provide to the computers of the future. As a demonstration of how serious it is about pursuing a graphene CPU, the company has invested $3 billion to fund the development of the technology and to polish it before it is finally introduced to the market.

The Technological Singularity

The technological singularity, or simply the Singularity, is the belief that the invention of artificial superintelligence (ASI), in combination with neurochips, will abruptly trigger runaway technological growth, resulting in unfathomable changes to human civilization and to humans themselves. Many refer to this human change as becoming cyborgs (half machine, half human). John von Neumann, Vernor Vinge, and Ray Kurzweil define the concept in terms of the technological creation of superintelligence. They argue that it is difficult or impossible for present-day humans to predict what human beings' lives would be like in a post-singularity world.

Some, such as myself, believe that the combination of technologies like molecular nanotechnology, neurochips (computer interfaces with human brains), and artificial intelligence may all combine to radically change the entire world and human society. Some even speculate that AI might bring about the end of mankind. The Singularity may happen within the next 20 or 30 years, if not sooner.

A neurochip is a chip (integrated circuit/microprocessor) designed to interact with neuronal cells. Science fiction has given us many stories of cyborgs with mechanical or machine-based parts. However, I believe we are entering a period where we may be able to use organic systems that are far more compatible. We are even seeing the dawn of the ability to create human organs or parts to replace failed ones.

"Neuromorphic computing—the next big thing in artificial intelligence—is on fire." That is a quote from a February 2018 article by Shelley Fan. Cybernetics is considered the science of integrating computers with people (inside of them). We may be in for some very interesting, but also dangerous, times ahead!

AMD Ryzen Laptops Now Available

HP, Lenovo, Acer, and Dell are all now offering Ryzen laptops, and several other companies have joined the bandwagon.
Prices range from about $580 to $800+, or even $1,500+ for a Ryzen 7 with a 17-inch screen.
Here is an example: Acer Swift 3 SF315-41-R5LE laptop, AMD Ryzen 7 2700U (2.20 GHz), 8 GB memory, 256 GB SSD, AMD Radeon Vega 10 graphics, 15.6″ screen, Windows 10 Home. Price: $921.00 (U.S.)

Another example:
HP Unveils EliteBook 700 G5, ProBook 645 G4 Laptops with Ryzen PRO [UPDATED]
by Anton Shilov on May 9, 2018 3:00 PM EST
The range includes the 735, 745, and 755 models (different screen sizes, etc.).

8K Video Editing capable systems

This particular build features the AMD Ryzen Threadripper 1950X processor. Based on AMD's game-changing ThreadRipper CPU architecture, the unit provides enough processing power for seamless 8K video editing in real time. Moreover, the performance benefits of ThreadRipper surpass those of the Xeon and i9 lines built by the engineers of team blue (Intel). Looking at Cinebench scores comparing ThreadRipper, an i9, and a Xeon processor, we found that AMD can achieve 37% better performance than Intel's line of processors.
Even with both the i9 and ThreadRipper processors overclocked, AMD still manages to pull ahead of team blue's flagship product. AMD's ThreadRipper architecture can be hailed as an example of how far processor speeds have come. A Geekbench comparison pitted the Intel 6700K, the former standard for efficient video editing, against other processors, including the ThreadRipper.
It goes without saying that the other essential component for the efficiency of every video editing workstation, besides the processor, is the graphics card. In this build, we put three different GPUs to the test: the brand-new AMD Vega Frontier for those using DaVinci Resolve, the NVIDIA 1080 Ti for those who want to take advantage of CUDA support when editing in Adobe Premiere Pro, and the AMD Radeon RX 580 for those working on a budget. At the end of the day, the ThreadRipper-based beast was able to complete each task with low rendering times and no perceivable issues.
Regarding efficiency, the ThreadRipper/Vega combination blows the competition out of the water. After placing 14 LUTs on a 4K clip inside DaVinci Resolve, the program was able to play back the footage flawlessly with only 35% CPU usage and 65% GPU usage. Now we're talking editing power!

If you think this editing PC wasn't jaw-dropping enough, keep in mind that it can even handle the playback of 8K footage, which is the equivalent of playing four streams of 4K footage simultaneously. This type of performance is absolutely insane, especially when you consider that this isn't an edit-friendly codec like ProRes or CineForm, but REDCODE RAW 8K footage. The fact that this machine can handle such quality is indisputable evidence of its computing power.
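The "four streams of 4K" equivalence is straight pixel arithmetic, since 8K UHD doubles both dimensions of 4K UHD:

```python
# UHD resolutions: 8K is exactly twice the width and height of 4K.
k4_pixels = 3840 * 2160
k8_pixels = 7680 * 4320
print(k8_pixels // k4_pixels)  # 4 -- one 8K frame = four 4K frames
```

Every 8K frame carries four times the pixel data, which is why decode and playback load scales so sharply.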

Meanwhile, here’s the full list of components you will need to build this 8K Video Editing Beast from scratch.
 Corsair Crystal Series 570X RGB Computer Case (B&H, Amazon US)
 EVGA SuperNOVA 850 G2 Power Supply (B&H, Amazon US)
 AMD Ryzen Threadripper 1950X CPU (B&H, Amazon US)
 ARCTIC Liquid Freezer 360, High-Performance CPU Water Cooler (Amazon US)
 GIGABYTE X399 AORUS Gaming 7 Motherboard (Amazon US)
 Ballistix Sport LT 64GB Kit RAM (B&H, Amazon US)
 Samsung 960 EVO Series – 1TB PCIe NVMe SSD (B&H, Amazon US)
 Corsair SP Series, SP120 RGB LED, 120mm High-Performance RGB LED Fan (B&H, Amazon US)
 AMD Radeon RX Vega Frontier 64 8GB Graphics Card, used in the video (for DaVinci Resolve) (B&H, Amazon US)
 NVIDIA GTX 1080 Ti 11GB Graphics Card (for Adobe Premiere Pro) (B&H, Amazon US)
 AMD Radeon RX 580 8GB Graphics Card (the budget option) (B&H, Amazon US)

As an alternative, here is some publicity from HP about the power of its workstations:

September 14, 2017
HP has revealed a truly insane powerhouse of a PC
Housing dual Xeon CPUs with 56 processing cores, Hewlett Packard's new Z8 workstation takes up to 3TB of RAM (I know!) and 48TB of storage space. While a fully decked-out Z8 might be out of the price range of most creatives, the base price isn't actually that bad, at a mere $2,439. For comparison, the 6-core Apple Mac Pro with 16GB RAM starts at $2,999.
There are actually three systems in HP’s new “Z” range. There’s the most powerful, the Z8, and then the imaginatively named Z4 and Z6 for not-quite-so-power-hungry users. Even the lowly Z4, though, is rather impressive.
Starting at only $1,249, the Z4 has a single 18-core Intel Xeon W-2155 CPU, up to 256GB RAM, and 4TB of internal storage. It's aimed primarily at 3D CAD users. The Z6 starts at $1,919 for the base model, with 48 cores of Intel Xeon Platinum 8180 processing, and supports up to 384GB RAM. The Z6 supports dual processors, too, and is aimed more at photographers, video editors, and visual effects artists.
The Z8, though, is built for the seriously hardcore. With 56 cores, 3TB RAM, 48TB storage, and 3x Nvidia Quadro P6000 graphics cards, it’s an absolute monster. Getting one to that spec, though, is probably going to be substantially more than the $2,439 base model price.
Of course, if you’re actually shooting and editing in 8K, even fully loaded, this computer is likely going to be nothing compared to what you paid for the camera.
HP Z4, Z6 and Z8 prices start at $1,249, $1,919 and $2,439 respectively, and should start to become available from October for the Z6 and Z8 and November for the Z4. You can find out more about them on the HP website.
Very sexy sounding machines, although personally I think I’ll stick to building my own. It’s easier on the wallet.
And, yes, there’s probably a typo on HP’s website. the Z4 doesn’t really support up to 256TB of RAM.

What are the system requirements to edit and produce an 8K video?
There aren't actually any hard-and-fast rules. Tools like FCP X and Premiere Pro CC support 8K video editing now. You'll want to make sure that you have a GPU supported by either product, plenty of CPU power, and plenty of RAM (16GB at least).
However, most of all, you need blazing fast disks. USB3 is probably not going to cut it. You’ll need either an internal disk array (very large, very fast) or an external array on Thunderbolt or, better, Thunderbolt 2.
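To see why USB3 "probably won't cut it", compare its bandwidth with the data rate of uncompressed 8K video. This is a deliberately simplified, worst-case illustration (uncompressed 8-bit RGB at 24fps; real camera codecs like REDCODE compress far below this):

```python
# Uncompressed 8K, 8-bit RGB (3 bytes/pixel) at 24 fps.
width, height, bytes_per_pixel, fps = 7680, 4320, 3, 24
rate_bytes_s = width * height * bytes_per_pixel * fps
print(f"~{rate_bytes_s / 1e9:.1f} GB/s uncompressed")  # ~2.4 GB/s

usb3_bytes_s = 5e9 / 8  # USB 3.0: 5 Gbit/s signalling -> 625 MB/s ceiling
print(rate_bytes_s > usb3_bytes_s)  # True -- USB 3.0 can't keep up
```

Even with heavy compression narrowing that gap, multi-stream editing and scrubbing are why a fast internal array or Thunderbolt storage is the safe choice.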
Mind you, if you are spending $100,000 for an 8K camera (like the Red WEAPON 8K), the computer’s really the least of your worries. You can buy whatever the manufacturer suggests.
Displaying your 8K video will be expensive. This size of video is currently used in huge venues with two or three massive projectors. To avoid a visible seam where the projectors overlap, one or two of the projectors play a neutral background while the 2K/4K video is playing. A software/hardware presentation device is used to separate out the projector feeds.
But the big question is: why 8K? Few commercially available cameras can shoot it, no monitor can display it at full resolution, and no single projector can project it without downscaling. If you are looking to future-proof, a simple 4K version is far easier to produce on every measurable front and will be a viable standard for years to come. In fact, a great deal of effects-heavy content is still rendered in 2K to save render time.
In 2014 only 1% of American homes had 4K TVs. We are sitting at about 10% now in early 2016, and estimates are that by 2020 it may be around 40-50% (source: New York Times). And that is just 4K.
8K over 4K will definitely be on the side of diminishing returns.
The quantum jump from B&W TV to color was amazing. Everybody was blown away.
The hop from VHS to DVD was jaw dropping.
The switch from standard def to HD was stunning.
Going from HD to UHD is a nice improvement for some content.
The switch from 4K to 8K (whatever it will be called) will be OK, but most people with average-sized 8K TVs will not even notice the difference over their old 4K set.
By 16k nobody will care anymore.
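The claim that viewers won't see the 4K-to-8K difference can be checked with a little trigonometry. The sketch below computes how many pixels fit in one degree of your field of view; the screen size, viewing distance, and the ~60 pixels-per-degree acuity figure are all assumptions for illustration:

```python
import math

def pixels_per_degree(diagonal_in, horizontal_px, distance_in, aspect=(16, 9)):
    """Pixels subtended by one degree of view at the given distance."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)   # physical screen width
    ppi = horizontal_px / width_in                   # pixels per inch
    inches_per_degree = distance_in * math.tan(math.radians(1))
    return ppi * inches_per_degree

# Assumed setup: a 65" TV viewed from 9 feet (108"), a common living room.
ppd_4k = pixels_per_degree(65, 3840, 108)
ppd_8k = pixels_per_degree(65, 7680, 108)

# ~60 pixels per degree (1 arc-minute) is a common estimate of 20/20 acuity.
print(f"4K: {ppd_4k:.0f} px/deg, 8K: {ppd_8k:.0f} px/deg (eye resolves ~60)")
```

Under these assumptions 4K already delivers roughly twice the detail a typical eye can resolve at that distance, so the extra 8K pixels go unseen unless the screen is far larger or the viewer sits much closer.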

All the TV and Movies You Want!

As mentioned in a previous article (TV Box), unlimited TV and movies are now a real option. However, it is essential to do some research and find out what your options are! Visiting the websites of your local channels is a good way to discover free shows. Sometimes those same TV channels also run an online service, sometimes free and sometimes paid. Buying a device such as an Apple TV or a Roku, or even a television with Roku built in, is a quick way to enjoy and try out these services. The Apple Store and Google Play Store often offer movie or TV-series rentals priced from 99 cents to $3.00 or more (check the terms; sometimes you only have 48 hours to watch). Other services and sites: ONF.ca, free (Office National du Film); BANQ.qc.ca, free (Bibliothèque et Archives Nationales du Québec); Video.tva.ca, free; Tou.tv / ICI.tou.tv, free or $6.99/month. There are also several paid services such as Netflix, Amazon Prime Video, etc. Read the previous article on TV Boxes for more information!

TV Box?

Free yourself from your cable provider! A TV box? Like an Apple TV, a Roku box, or similar technology, a TV box gives you access to an enormous range of services and media available on the Internet. Such a box is simply a computer with the connections and ports needed to use the Internet to stream or download TV shows and movies. Most TV boxes run Android or Windows. Android is the same operating system as your cell phone (unless it's an Apple) or your tablet, along with millions of other devices around the world. You can also install a program such as KODI on your box and use it that way. KODI is the modern evolution of what was originally XBMC (X-Box Media Center); Kodi is now at stable version 17.6. Like millions of people in Canada and the United States, cut the fees tied to your cable or similar service and get access to more content. This is the public-interest information least publicized by the Canadian and American governments.
The Canadian government has made very little effort to inform Canadians that the HD (high-definition) broadcast signal can be received for free in every Canadian home.
For example, the United States and France have informational websites about over-the-air HD reception, where you can find a map of the HD transmitter antennas across their territory. No such tools exist in Canada, which suits the paid cable and satellite television providers just fine. With a good antenna it is possible to receive nearly 21 channels or stations, such as the following list of TV channels whose HD signal you can receive.
This list may vary depending on the region of Quebec where you install the HD antenna.

Canadian TV channels: Radio-Canada – CTV – Global – V – Télé-Québec – CBC – City – Canal Savoir – TVA – ICI
American TV channels that can be received in Canada: CBS – FOX – NBC – ABC – PBS KIDS – CW TV – MeTV – Thirteen PBS – MHz Worldview – PBS – Create TV – World Compass – Local News and Weather

Many people are furious when they learn they can receive HD (high-definition) television for free with a simple HD antenna while they pay for a monthly subscription. All the more so because the signal from an antenna is generally of better quality than the one delivered by cable or satellite: providers often compress their signals to squeeze in more channels.

Is it legal?
In Canada, streaming is legal because it involves no actual downloading; the work is never stored on the device. The law targets those who make content available without prior authorization, not those who access that content. You can therefore watch the content you want with peace of mind and entirely legally. Moreover, it is impossible to trace people who merely stream content online without downloading it. The box simply directs you to Internet links safely, steering clear of potential viruses or other malware that could lurk on streaming sites.

Some people will ask why bother with such a thing. For me: no fees for more than 20 months for all my channels and content, no movie-theatre costs, no inflated bills! The savings can be enormous! The number of channels and/or movies available also far exceeds what my provider offered at the price I was paying! You may have heard various terms or names for this technology, such as streaming media device, TV box, IPTV box (IPTV services usually involve monthly billing), set-top box, media streamer, HTPC, Kodi box, and my favourite term, Android TV box. They are all more or less the same kind of thing: a device that fetches content from your network or the Internet and puts it on your TV screen. Any modern television with an HDMI port or connection can be used with a TV box. One of the most important points: if you want high definition, make sure you have fast Internet service (20+ Mbps; test it with speedtest.net), good bandwidth, and a generous data allowance (150 GB+). Check with your Internet provider for the details of your plan.

Don't forget to check the data allowance of your Internet plan.
Since you watch movies and TV series on your TV box over the Internet, make sure you have an adequate Internet plan. We suggest a plan with unlimited data. Keep an eye on your bandwidth usage, otherwise your provider could charge you overage fees.
Rough example: watching a 90-minute movie on your TV box will use up to 3 GB of your data transfer.
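That 3 GB figure follows directly from the stream's bitrate: data used equals bitrate times duration. A quick sketch; the 4.5 Mbit/s HD bitrate is an assumption, since actual bitrates vary by service and quality setting:

```python
# Data consumed by a stream = bitrate x duration.
# Assumed bitrate: 4.5 Mbit/s, a plausible figure for 1080p streaming.

bitrate_mbps = 4.5
minutes = 90

gigabytes = bitrate_mbps * minutes * 60 / 8 / 1000  # Mbit -> MByte -> GB
print(f"A {minutes}-minute movie at {bitrate_mbps} Mbit/s uses about {gigabytes:.1f} GB")

# At that rate, a 150 GB monthly cap allows roughly:
movies_per_month = 150 / gigabytes
print(f"Roughly {movies_per_month:.0f} such movies per month on a 150 GB cap")
```

Running the numbers this way also shows why a 150 GB+ allowance matters: a few movies a week plus regular TV series can approach a modest cap quickly.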

It is now possible to buy a television with Roku built in as a service. I own a Roku TV as well as a TV box, with two external antennas that give me access to all the available local stations and channels! Be careful when shopping for a TV box: many sellers offer boxes built on technology that is two or three years old! You should get the best and most recent box your budget allows, with as much memory and storage as possible. Android boxes with Kodi installed are currently the most popular. In terms of hardware, aim for at least an S912 octa-core CPU, 3 GB of RAM, and 32 GB of storage as a baseline. Some boxes now offer DDR4 RAM, which is superior to DDR3! Buy 2 GB of RAM or more, since 1 GB will be too little; two (2) GB of RAM is a minimum in my opinion! Avoid older CPUs such as the S905, etc.
To your KODI you will add add-ons such as Exodus, Covenant, Phoenix, SALT, or others. For good image resolution, the graphics/GPU should be an ARM Mali-T820 or better for high definition or 4K (your TV must support HD or 4K in order to display it). Android version 6.0 or 7 gives the best compatibility. Prices range from roughly $75 to $120 for the higher-end models! Do your research before making a purchase!