The Retina display first appeared on the iPhone 4 in 2010. After that, the very high-resolution display made its way to the iPad and then to the MacBook Pro. Today, Apple introduced to the world a 27-inch iMac desktop featuring a display with a remarkable 5K resolution.
If you want the exact numbers, the resolution is 5120 x 2880 pixels, which makes the iMac the undisputed leader among desktops. 14.7 million pixels – that's exactly how many you'll find on the 27-inch display. You can play seven Full HD movies side by side, or edit a 4K video and still have plenty of space left on your desktop.
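The quoted figures are easy to verify with a bit of arithmetic (a quick sanity check of my own, not part of the article):

```python
# Verify the pixel counts quoted for the Retina 5K panel.
retina_5k = 5120 * 2880        # total pixels on the 5K panel
full_hd = 1920 * 1080          # pixels in one Full HD frame

print(retina_5k)               # 14745600 -> ~14.7 million pixels
print(retina_5k // full_hd)    # 7 -> seven Full HD frames fit on the panel
```

The division comes out at just over 7, which matches the "seven Full HD movies side by side" claim.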
The entire panel consists of 23 layers yet is only 1.4 millimeters thick. In terms of energy, the new Retina 5K display is 30% more efficient than the standard display supplied in the 27-inch iMac. The backlight is LED, and the panel itself uses oxide-based TFT (thin-film transistor) technology, i.e. Oxide TFT.
Since the Retina 5K display contains four times more pixels than the previous iMac's display, the way the panel is driven had to change. Apple therefore developed its own TCON (timing controller). Thanks to the TCON, the new iMac easily handles a data stream with a throughput of 40 Gb per second.
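To put that 40 Gb/s figure in context, here is a back-of-envelope estimate (my own illustrative numbers, not from the article; blanking intervals and protocol overhead are ignored):

```python
# Rough uncompressed-bandwidth estimate for 5K at 60 Hz, 24-bit color.
active_bits_per_s = 5120 * 2880 * 60 * 24   # active pixel data only
dp12_payload = 17.28e9                      # DisplayPort 1.2 effective payload, bits/s

print(active_bits_per_s / 1e9)              # 21.233664 -> ~21.2 Gb/s
print(active_bits_per_s > dp12_payload)     # True: one DP 1.2 stream is not enough
```

Even without blanking overhead, the raw pixel stream exceeds what a single DisplayPort 1.2 link (which Thunderbolt 2 tunnels for video) can carry, which is why Apple drives the panel internally with a custom TCON.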
At the edges, the iMac is only 5 millimeters thick, though it of course bulges in the middle to accommodate all the hardware. The base configuration gets a quad-core Intel Core i5 processor clocked at 3.5 GHz; for an additional fee, Apple offers a more powerful 4 GHz i7. Both processors support Turbo Boost 2.0, which automatically raises performance whenever needed.
Graphics are handled by an AMD Radeon R9 M290X with 2 GB of GDDR5 memory; for an additional fee, you can get the AMD Radeon R9 M295X with 4 GB of GDDR5. As for RAM, 8 GB (1600 MHz DDR3) comes as standard, and the four SO-DIMM slots can be fitted with up to 32 GB.
For your data, you get a 1 TB Fusion Drive. You can configure up to a 3 TB Fusion Drive, or a 256 GB, 512 GB or 1 TB SSD. You won't find a plain hard drive in the iMac with Retina 5K display, which should surprise no one.
And now for connectivity – a 3.5mm headphone jack, 4x USB 3.0, an SDXC memory card slot, 2x Thunderbolt 2, an RJ-45 port for Gigabit Ethernet, and a Kensington lock slot. On the wireless side, the iMac supports Bluetooth 4.0 and 802.11ac Wi-Fi.
The dimensions of the computer (H x W x D) are 51.6 cm x 65 cm x 20.3 cm, and the weight comes to 9.54 kilograms. In addition to the iMac itself, the package includes a power cable, a Magic Mouse and a wireless keyboard. The price in the Apple Online Store starts at 69,990 crowns.
69 :-)
Apple has an interesting calculator :D
You mean Apple expects a worse business environment in Europe? E.g. the mandatory two-year warranty raises costs and thus the price. Add some protection against exchange-rate fluctuations and a brutal 21% VAT, and the 1 USD = 1 EUR rate starts to make sense.
If I remember correctly, in the USA the price works out to approx. 52,000 CZK.
US prices are always quoted without tax. So multiply USD 2,499 by the exchange rate, add 21% Czech VAT to the result, and you're at about 65,000 CZK.
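The commenter's calculation can be sketched like this (the ~21.5 CZK/USD exchange rate of autumn 2014 is my assumption, not stated in the thread):

```python
# Hypothetical reconstruction of the US-price-plus-Czech-VAT calculation.
usd_price = 2499        # US list price, excluding sales tax
czk_per_usd = 21.5      # assumed autumn-2014 exchange rate
czech_vat = 1.21        # 21% Czech VAT

czk_price = usd_price * czk_per_usd * czech_vat
print(round(czk_price))  # 65011 -> roughly the quoted ~65,000 CZK
```

The result lands very close to the official Czech price once VAT is included, which is the commenter's point.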
If you buy in the US over the internet and have it shipped to a state other than the one where the e-shop is based, the sales tax isn't charged. And every state has a slightly different tax, up to a maximum of about 4-5% – compared to the Czech Republic, it's a paradise...
Jirka: In my opinion, you always pay the tax of the state the goods are delivered to – at least whenever I bought something, that's how it worked – and the tax differs in every state. In any case, if you wanted to import it officially (stuffing an iMac into your carry-on and smuggling it onto a plane is not exactly easy), you'd pay the Czech VAT and end up at a price very similar to the official Czech one – and with only a one-year warranty on top of that.
…max 4-5%? Hardly... just google "sales tax".
There probably won't be a standalone display yet. Shame :(
There is nothing you could connect it to :-) Thunderbolt 2 can't carry that resolution :-) that will have to wait for the next Thunderbolt... Of course there is a way around it: the display could present itself as two displays, which Thunderbolt 2 could handle... but unfortunately, it looks like only the next generation of devices will be able to drive a 5K display.
Even the first Thunderbolt has plenty of raw bandwidth, let alone the second. I'd guess it's only a software update, right?
Raw data bandwidth is one thing and the video signal is another :-) Thunderbolt 2 can only carry video up to 4K resolution to a single monitor...
In that case, I'm very disappointed they didn't announce a standard that can handle it – or didn't ship the monitor as 4K instead.
There are several options here and none of them is good:
– the standalone display stays at today's below-average resolution
– a standalone 4K display will have a worse resolution than the iMac
– a standalone 5K display cannot be connected to the current Mac Pro or Mac mini
OK, am I the only one who finds the Intel Core i5 (3.5 GHz) and 8 GB RAM (at what frequency?) configuration of the Retina iMac quite comical? :D And switching to Radeons just as Nvidia introduced the GTX 970 and GTX 980 – basically the best consumer-class cards...
They didn't change much here apart from the display, which I consider pointless – I'm sitting in front of a mid-2010 27″ iMac, so I can judge that its pixel density is sufficient. The display may be great for editing 4K video, but the rest of the hardware falls short of that, and if I had a 4K camera, I'd probably go for a Mac Pro by now.
Well, at least it convinced me to build a Hackintosh: Intel Core i7 4790K (overclocked to approx. 4.7 GHz), Nvidia GTX 970 SC, 32 GB RAM (2400 MHz), 2x 512 GB SSD in RAID 0... price about $2,000.
I think the configuration is perfectly fine; I just wouldn't want the new NVIDIA cards you mention in it.
It looks OK to me too. That is, except for the 8 GB of RAM in the base model, when the 15″ MacBook Pro has 16.
The iMac has the advantage that its RAM is user-replaceable. And it's much cheaper to buy it on the open market than from Apple :-)
Well, that's really great. When RAM was replaceable in the MBP, it was easy to get inside, so it made sense – I did it myself. But what kind of fool is going to go in through the display just for cheaper RAM in an iMac?
In the 27″ iMac, the RAM is replaced normally through a door on the back – a few minutes' work. And according to the information on Apple's website, it will be the same in the Retina iMac.
As dfx says - USER
For video editing, Radeons are better than Nvidia cards. They have significantly better OpenCL performance, which is what e.g. Final Cut Pro X uses.
I edit in Premiere and After Effects, so CUDA cores matter more to me. OpenCL also works very well with Nvidia's web drivers on the Mac. The i5 and 8 GB of RAM aren't particularly suited to editing – rendering and encoding especially will be quite annoying. :D
I don't understand why they didn't introduce a standalone 5K display straight away – they could use the very same panel :(
Only Apple can get away with selling a two-year-old graphics card like this. Disappointing for me – let's not be dazzled by the wow of the 5K display. Maybe good for fans of TV series, but the hardware can barely handle a movie at that resolution, let alone a game. It's a mockery... A decent configuration with 4 GB of VRAM, a 4 GHz processor and 16 GB of RAM comes to roughly 70,000 CZK plus duty and VAT, so around 80,000 here – at that point I'd rather buy that Mac Pro "urn" cylinder after all :)
You're completely off the mark. Bye
Obviously a troll – you write the same thing under the next article... So I'll add the same answer again: NVIDIA is much better, sure, but don't be biased. What Apple put in the iMac is the CURRENT top of ATI's card lineup. If, like many other manufacturers, you go with ATI, you simply can't offer anything better.
I should also add that this is not a machine designed for games. You can't run current games in 4K, let alone 5K, on any computer today – let alone in an all-in-one, have a shred of common sense. NO current gaming machine is, or can be, built for 4K or 5K, so I don't see why you hold it against the Retina iMac, which doesn't even claim that role. The graphics card is only there to accelerate video and draw the whole system at 5K. Whether it's ATI or NVIDIA matters less in this case than elsewhere, because neither company's cards have any chance of running games at 5K.
It's not an old graphics card – see my first answer. Otherwise, I don't know of any series being broadcast in 4K/5K; enlighten me. And GB of RAM has nothing to do with display resolution.
From my point of view, this machine is mainly made for multimedia entertainment – browsing the web, photos, movies – and for work editing video and, above all, editing photos and DTP.
OK. So what is the iMac 5K designed for, if not games? Graphics work? Then we can speculate about the display's quality for that purpose – color fidelity, gamut, etc. I'm just asking openly: what, and who, is such an iMac 5K meant for? For Word in the office? For that kind of money?
For video editing – it's at its best with 4K footage. Or for a software developer.
Editing 4K video on a two-year-old 2 GB graphics card? Come on…
The graphics support AMD's latest Mantle technology, which didn't exist two years ago. And OpenCL support is better than on nVidia, which pushes its own CUDA instead. I wouldn't call it a two-year-old card. BTW, for 250 euros extra you can get the more powerful version with 4 GB of RAM.
That's right – I'm quoting from the link http://notebook.cz/clanky/technologie/2014/amd-radeon-r9-m290x-stale-stejny-kral-mobilniho-segmentu-amd: "The throne of the most powerful graphics accelerator is currently held by NVIDIA, but AMD is still within range. What kind of weapon is the top of its current range, the R9 M290X? It must be said there is nothing to cheer about. The biggest technological change AMD introduced in this card needs no lengthy explanation – it is a simple re-branding, i.e. renaming. The intention itself is good: shortening the word Radeon to R and adding a number to distinguish the performance class (here 9 for the highest series) saves a few keystrokes, and users will certainly find their way around the lineup faster. Compared to the long four-digit codes of the previous generation, it is definitely a step forward. However, AMD wanted to benefit from the new name right away, so under the 'hood' of the new name we find the old Radeon HD 8970M, with virtually no changes. And the lament can continue: the Radeon HD 8970M is not really a native mobile card either, but a desktop part released in March 2012. So AMD's king of the mobile segment is a two-year-old desktop card with cut frequencies and a different layout of the components on the board (at lower frequencies they heat up less and can therefore sit closer together). Something doesn't seem quite right here. There is only one bright side to the whole thing – always wanted desktop performance in a laptop?"
You're really off the mark..
So, first of all, that card is not for games – even though games adapted for Mantle will run on it without problems even at 5K. And secondly, the software for it is made by Apple, and Apple will make sure everything runs smoothly on it. AMD graphics are also used in the Mac Pro, so it won't be much extra work.
And another thing I wanted to say: no processor manufacturer is able to release a new architecture every year. Not even Intel, not even nVidia. A new architecture comes out roughly once every 5 years; in between, the current one is only improved – new functions are added, power consumption is reduced, and so on.
Are you rendering the video with a graphics card?
Why wouldn't I? It's faster than the CPU :)
What nonsense – building a gaming machine that plays today's games in 4K is not a problem. You can also buy one pre-built, and the price will be roughly similar to what you'd pay for this iMac.
Really? Then please show me – I'm curious how you'll fit into $2,500, when the monitor alone will cost you that much, if you really want to compare :D
I'm not writing about the quality of the monitor; I'm responding to your claim that "NO current gaming machine is, or can be, built for 4K or 5K", which simply isn't true. The first link on Google is a HAL3000 Samvidia 4K priced at 85 thousand including the monitor. Which is simply not far from the price of the iMac (which, as I read below, costs about 65 thousand in the Czech Republic), and its performance is in another league.
You're really just catching me on my exact words. OK, there exists an overclocked computer that can handle SOME games in 4K at 30 FPS – barely. In 5K, no chance. And as for the really demanding titles: just because such machines are sold doesn't mean they really handle them. Again, I'm just taking you at your word.
I just wanted to say that you really won't find a better PC with such a display. That's all..
A lot of people won't understand until they try it themselves. If you create graphics that are demanding on hardware, I recommend that next time you're in the Czech Republic, you go to iStyl and try it – they'll let you install the software you want to try if you bring it with you. Then we can talk. You just can't compare how, say, Photoshop runs on a Windows build versus an OS X build – it's simply not comparable...
The Radeon R9 M295X graphics card is from this year – slightly faster than the GeForce GTX 980M, which is currently the sixth-fastest card on the market.
Really? Can you post a link showing the M295X is faster than the GTX 980M? Everywhere I've looked, it's exactly the opposite.
http://notebook.cz/clanky/technologie/2014/amd-radeon-r9-m290x-stale-stejny-kral-mobilniho-segmentu-amd
Correction: an iMac 5K with 4 GB VRAM, 16 GB RAM and a 4 GHz processor comes to an incredible 88,190 CZK. Well, don't buy it :) I really prefer the hexa-core Mac Pro :)
Dell will launch a 5K monitor in December priced at $3,000. Apple has an entire computer priced at $2,400….
That's still nonsense. I'm surprised none of the engineers here have long since gone to work at Apple to advise those morons who (as I read here) don't understand it at all..
I applaud, bravo, this is what it should look like! :)