Your most important tool in Space Imaging...
#1
Your most important tool in Space Imaging....
after your brain that is...

a bad ass computer.

Makes all the difference in the world.

I've always tried to stay on the leading edge with both software and hardware.
For example, here is what I'm running now:

MSI MoBo
AMD FX-4300 Vishera quad-core 4 GHz CPU
512 GB SSD, two 1 TB HDDs
16 GB RAM
Radeon HD 7870 graphics
27" AOC monitor
CoolerMaster tower with fans galore; cooling is a must

I built all that from components bought on sale, for a total of about $700.

You should see Mars the way I see it.

Smoke
On a satellite I ride. Nothing down below can hide.
#2
I have lost two Acer netbooks, one full Acer laptop, one full Gateway laptop, and a 10-inch Proscan tablet in the last month.

I am on a new Android tablet as I type this and hate this tiny little device.

And by LOST I mean

halt and catch fire events.

I can't wait to get a new PC.
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
#3
EA, my store-bought Acer has been having blue-screen issues at the most annoying times lately.

It seems the "kit" machine I put together is a bit tougher than the factory-produced one.  :thwack:

Unfortunately, all the purchased software is upstairs on the machine I put together, the one I am using now while backing up the website from an internal 2 TB drive to a 3 TB external.

I'd hate the time and trouble of installing all that on the downstairs one, or of moving it up here and reinstalling everything all over again.

Backups and more backups ... you can never have too many. Guitar

Bob... Ninja Reefer
"The Light" - Jefferson Starship-Windows of Heaven Album
I'm an Earthling with a Martian Soul wanting to go Home.   
You have to turn your own lightbulb on. ©stevo25 & rhw007
#4
Sounds like you need a new one.
I started building my own computers when the factory-built units I'd used started dying. When replacing components I saw how cheaply they were made. Factory systems are a rip-off. The most I ever got out of one was a couple of years before it gave up the ghost. They're generally cobbled together out of the cheapest low-grade parts, and most struggle with games or with large-scale imaging.
I built this one out of the best components I could find in my price range. The total cost was right at $650. It can handle the largest Mars images without a pause, and it can play all my favorite games on the highest graphics settings. It's WAY easier than you'd think to build one.
On a satellite I ride. Nothing down below can hide.
#5
Hi, I use

Water-cooled dual-core and quad-core Apple PowerPCs (immense imaging power with high redundancy).
(PowerPC processors are also flown on MSL Curiosity.) I was using these units before MSL was built, and it came as a pleasant surprise that she has PowerPC chips at her core.
iPads and iPhones, all linked with Apple TV for free-roaming traverse examination on a 4K TV.
All protected behind a Linux server (4 TB). I use an ASUS X50GL for out and about, plus the iPhone ... screen captures are sent back to the server for later perusal on the PowerPCs.
Software used ( SHHHH! ).
A very big tip for imaging computing power.... Alienware, ironically named. I have an Area-51 unit as backup...
Cheers
TW

.....“from one thing, know ten thousand things”
Miyamoto Musashi,

#6
(07-24-2014, 02:12 AM)EA Wrote: I have lost two Acer netbooks, one full Acer laptop, one full Gateway laptop, and a 10-inch Proscan tablet in the last month.

I am on a new Android tablet as I type this and hate this tiny little device.

And by LOST I mean

halt and catch fire events.

I can't wait to get a new PC.

What I meant by LOST is probably, and most likely, now revealed: lincoln et al., with malware and viruses.

Bastard!!!  Gangup
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
#7
Novel Super-Resolution Restoration Technique Shows Mars in Incredible Detail  Holycowsmile
Apr 26, 2016 by Editors

S.P.I.T. gets a REBOOT! LilD

The Martian surface — including the location of ESA’s Beagle-2 lander, and the ancient lakebeds discovered by NASA’s Curiosity rover — has been shown in unprecedented detail by planetary researchers at University College London (UCL).
[Image: image_3818_1-Beagle-2.jpg]
Beagle-2 landing site: original image (upper panel) and SRR from 5 input images (lower). The bright object in the upper centre portion is shown in the next figure. Map co-ordinates come from NASA’s HiRISE camera and are not in a global reference system. Image credit: Yu Tao & Jan-Peter Muller, University College London.

According to Prof. Jan-Peter Muller from the UCL Mullard Space Science Laboratory and Yu Tao, a researcher at UCL, the technique they developed — called Super-Resolution Restoration (SRR) — could be used to search for other artifacts from past failed landings as well as identify safe landing locations for future rover missions.
A paper describing the SRR technique was published in the February issue of the journal Planetary and Space Science.
“We now have the equivalent of drone-eye vision anywhere on the surface of Mars where there are enough clear repeat pictures,” Prof. Muller said.
“It allows us to see objects in much sharper focus from orbit than ever before and the picture quality is comparable to that obtained from landers.”
“As more pictures are collected, we will see increasing evidence of the kind we have only seen from the three successful rover missions to date. This will be a game-changer and the start of a new era in planetary exploration,” he said.
For cameras orbiting Earth and Mars, the resolution limit today is around 10 inches (25 cm).
By stacking and matching pictures of the same area taken from different angles, SRR allows objects as small as 2 inches (5 cm) to be seen from the same 25-cm telescope.
For the Red Planet, where the surface usually takes decades to millions of years to change, these images can be captured over a period of ten years and still achieve a high resolution.
For Earth, the atmosphere is much more turbulent so images for each stack have to be obtained in a matter of seconds.
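The stack-and-match idea can be sketched in miniature. Below is a toy shift-and-add scheme in Python, assuming the sub-pixel shifts between frames are already known; UCL's actual SRR pipeline adds sub-pixel registration and regularized restoration on top of this basic intuition:

```python
import numpy as np

def shift_and_add(images, shifts, factor=2):
    """Toy super-resolution: drop each low-res frame onto a finer grid
    according to its known sub-pixel (dy, dx) shift, then average
    wherever samples overlap."""
    h, w = images[0].shape
    hi = np.zeros((h * factor, w * factor))
    hits = np.zeros_like(hi)
    for img, (dy, dx) in zip(images, shifts):
        for r in range(h):
            for c in range(w):
                y = int(round((r + dy) * factor))
                x = int(round((c + dx) * factor))
                if 0 <= y < hi.shape[0] and 0 <= x < hi.shape[1]:
                    hi[y, x] += img[r, c]
                    hits[y, x] += 1
    filled = hits > 0
    hi[filled] /= hits[filled]  # average; unsampled cells stay zero
    return hi

# Four frames of the same flat scene, each offset by half a pixel,
# together fill every cell of a 2x-finer grid.
frames = [np.ones((4, 4)) for _ in range(4)]
shifts = [(0, 0), (0.5, 0), (0, 0.5), (0.5, 0.5)]
fine = shift_and_add(frames, shifts, factor=2)
```

With four half-pixel-offset frames, every cell of the doubled grid receives a sample, which is the intuition behind recovering 5-cm detail from 25-cm pixels.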
[Image: image_3818_2-Beagle-2.jpg]
Zoom-up of the proposed Beagle-2 location (left panel) at the original 25 cm resolution; zoom-up of the SRR of proposed lander location (center panel) at 6.25 cm; cartoon sketch of Beagle-2 superimposed on the right of the proposed lander location at the same scale on SRR (right panel). Image credit: University College London.

Tao and Prof. Muller applied SRR to stacks of between four and eight 25-cm (10-inch) images of the Martian surface taken using the High Resolution Imaging Science Experiment (HiRISE) — a camera on board NASA’s Mars Reconnaissance Orbiter — to achieve the 5-cm (2-inch) target resolution. These included some of the latest images of the Beagle-2 landing area.
“Our technique has huge potential to improve our knowledge of a planet’s surface from multiple remotely sensed images,” Tao said.
“In the future, we will be able to recreate rover-scale images anywhere on the surface of Mars and other planets from repeat image stacks.”
The team plans on exploring other areas of Mars using the SRR technique to see what else they find.
_____
Y. Tao & J.-P. Muller. 2016. A novel method for surface exploration: Super-resolution restoration of Mars repeat-pass orbital imagery. Planetary and Space Science, vol. 121, pp. 103-114; doi: 10.1016/j.pss.2015.11.010

Along the vines of the Vineyard.
With a forked tongue the snake singsss...
#8
Quote:These structures could therefore have been the first astronomical tools to support the watching of the skies, millennia before telescopes were invented.

Prehistoric tombs enhanced astronomical viewing
June 30, 2016

[Image: prehistorict.jpg]
The megalithic cluster of Carregal do Sal. Credit: University of Nottingham


Astronomers are exploring what might be described as the first astronomical observing tool, potentially used by prehistoric humans 6,000 years ago. Holycowsmile



Read more at: http://phys.org/news/2016-06-prehistoric-tombs-astronomical-viewing.html#jCp


They suggest that the long, narrow entrance passages to ancient stone, or megalithic, tombs may have enhanced what early human cultures could see in the night sky - an effect that could have been interpreted as the ancestors granting special power to the initiated.
The team, led by Nottingham Trent University, presents its study at the National Astronomy Meeting, being held this week in Nottingham.
Their idea is to investigate how a simple aperture, for example an opening or doorway, affects the observation of slightly fainter stars. They focus this study on 'passage graves', which are a type of megalithic tomb composed of a chamber of large interlocking stones and a long narrow entrance. These spaces are thought to have been sacred, and the sites may have been used for rites of passage, where the initiated would spend the night inside the tomb, with no natural light apart from that shining down the narrow entrance lined with the remains of the tribe's ancestors.
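The geometric effect of a long, narrow entrance can be sketched with one line of trigonometry. The dimensions below are hypothetical, not measurements of any particular tomb:

```python
import math

def passage_fov_deg(passage_length_m, entrance_width_m):
    """Full angular field of view seen from the back of a straight
    passage through an entrance of the given width."""
    return math.degrees(2 * math.atan(entrance_width_m / (2 * passage_length_m)))

# A hypothetical 7 m passage with a 1 m wide entrance restricts the
# observer to a window of roughly 8 degrees of sky.
window = passage_fov_deg(7.0, 1.0)
```

Such a narrow, dark aperture also keeps stray twilight out of the observer's dark-adapted eye, which is the mechanism the team proposes for spotting faint stars before full darkness.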

These structures could therefore have been the first astronomical tools to support the watching of the skies, millennia before telescopes were invented.

Kieran Simcox, a student in Nottingham Trent University's School of Science and Technology, and leading the project, said: "It is quite a surprise that no one has thoroughly investigated how for example the colour of the night sky impacts on what can be seen with the naked eye."
The project targets how the human eye, without the aid of any telescopic device, can see stars given sky brightness and colour. The team intends to apply these ideas to the case of passage graves, such as the 6,000-year-old Seven-Stone Antas in central Portugal.
Dr Fabio Silva, of the University of Wales Trinity Saint David, said: "The orientations of the tombs may be in alignment with Aldebaran, the brightest star in the constellation of Taurus. To accurately time the first appearance of this star in the season, it is vital to be able to detect stars during twilight."
The first sighting in the year of a star after its long absence from the night sky might have been used as a seasonal marker, and could indicate for example the start of a migration to summer grazing grounds. The timing of this could have been seen as secret knowledge or foresight, only obtained after a night spent in contact with the ancestors in the depths of a passage grave, since the star may not have been observable from outside. However, the team suggest it could actually have been the result of the ability of the human eye to spot stars in such twilight conditions, given the small entrance passages of the tombs.
The yearly National Astronomy Meetings have always had some aspects of cultural astronomy present in their schedules. This is the third year running where a designated session is included, exploring the connection between the sky, societies, cultures and people throughout time.
The session organiser over the past three years, Dr Daniel Brown of Nottingham Trent University, said: "It highlights the cultural agenda within astronomy, also recognised by the inclusion of aspects of ancient astronomy within the GCSE astronomy curriculum."
Provided by: University of Nottingham


Along the vines of the Vineyard.
With a forked tongue the snake singsss...
#9
(06-18-2014, 10:07 PM)Keith Wrote: Your most important tool in Space Imaging....
after your brain that is...

a bad ass computer.

Makes all the difference in the world.

The 'Nyquist sampling theorem': does it apply to Mars imagery?

Neural networks promise sharpest ever images
February 22, 2017

[Image: 1-neuralnetwor.png]
The frames here show an example of an original galaxy image (left), the same image deliberately degraded (second from left), the image after recovery with the neural net (second from right), and the image processed with deconvolution, the best existing technique (right). Credit: K. Schawinski / C. Zhang / ETH Zurich.
Telescopes, the workhorse instruments of astronomy, are limited by the size of the mirror or lens they use. Using 'neural nets', a form of artificial intelligence, a group of Swiss researchers now have a way to push past that limit, offering scientists the prospect of the sharpest ever images in optical astronomy. The new work appears in a paper in Monthly Notices of the Royal Astronomical Society.



The diameter of its lens or mirror, the so-called aperture, fundamentally limits any telescope. In simple terms, the bigger the mirror or lens, the more light it gathers, allowing astronomers to detect fainter objects, and to observe them more clearly. A statistical concept known as 'Nyquist sampling theorem' describes the resolution limit, and hence how much detail can be seen.
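That limit can be made concrete with a back-of-the-envelope sketch: the Rayleigh criterion sets the finest resolvable angle, and Nyquist then requires at least two detector samples per resolved feature. The numbers below are illustrative (roughly HiRISE-like) and not taken from the article:

```python
import math

def rayleigh_limit_rad(wavelength_m, aperture_m):
    """Diffraction-limited angular resolution (Rayleigh criterion)."""
    return 1.22 * wavelength_m / aperture_m

def finest_resolvable_m(wavelength_m, aperture_m, range_m):
    """Smallest ground feature a diffraction-limited telescope
    can separate at the given range."""
    return rayleigh_limit_rad(wavelength_m, aperture_m) * range_m

# Illustrative: 0.5 m aperture, green light, 300 km orbit.
feature = finest_resolvable_m(550e-9, 0.5, 300e3)  # ~0.4 m on the ground
pixel = feature / 2  # Nyquist: sample at least twice per resolved feature
```

This is why "bigger mirror, closer orbit" is normally the only route to sharper pictures, and why the neural-net result below is notable for trying to go past it.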


The Swiss study, led by Prof Kevin Schawinski of ETH Zurich, uses the latest in machine learning technology to challenge this limit. They teach a neural network, a computational approach that simulates the neurons in a brain, what galaxies look like, and then ask it to automatically recover a blurred image and turn it into a sharp one. 

Quote:Your most important tool in Space Imaging....
after your brain that is...

a bad ass computer.

Makes all the difference in the world if you blend/meld those thoughts? Sharper Cydonia and microscopic rover image enhancements?
[Image: seven-earth-size-exoplanets-discovered-6...576-hp.gif]

Just like a human, the neural net needs examples - in this case a blurred and a sharp image of the same galaxy - to learn the technique.

Their system uses two neural nets competing with each other, an emerging approach popular with the machine learning research community called a "generative adversarial network", or GAN. The whole teaching programme took just a few hours on a high performance computer.
The trained neural nets were able to recognise and reconstruct features that the telescope could not resolve - such as star-forming regions, bars and dust lanes in galaxies. The scientists checked it against the original high-resolution image to test its performance, finding it better able to recover features than anything used to date, including the 'deconvolution' approach used to improve the images made in the early years of the Hubble Space Telescope.
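For comparison, the 'deconvolution' baseline mentioned above can be sketched. This is a minimal Richardson-Lucy iteration, a standard textbook form rather than the exact variant used on early Hubble data:

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(blurred, psf, iterations=50):
    """Minimal Richardson-Lucy deconvolution: repeatedly re-blur the
    current estimate, compare it with the observed image, and correct."""
    psf = psf / psf.sum()
    psf_mirror = psf[::-1, ::-1]
    estimate = np.full(blurred.shape, 0.5)
    for _ in range(iterations):
        reblurred = fftconvolve(estimate, psf, mode="same")
        ratio = blurred / np.maximum(reblurred, 1e-12)
        estimate = estimate * fftconvolve(ratio, psf_mirror, mode="same")
    return estimate

# Blur a single bright point with a small Gaussian PSF, then recover it.
y, x = np.mgrid[-2:3, -2:3]
psf = np.exp(-(x**2 + y**2) / 2.0)
psf /= psf.sum()
sharp = np.zeros((32, 32))
sharp[16, 16] = 1.0
blurred = fftconvolve(sharp, psf, mode="same")
restored = richardson_lucy(blurred, psf)
```

Unlike the neural net, this needs the point-spread function to be known, and it cannot invent detail beyond the diffraction limit; that is the gap the GAN approach claims to close.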
Schawinski sees this as a big step forward: "We can start by going back to sky surveys made with telescopes over many years, see more detail than ever before, and for example learn more about the structure of galaxies. There is no reason why we can't then apply this technique to the deepest images from Hubble, and the coming James Webb Space Telescope, to learn more about the earliest structures in the Universe."
Professor Ce Zhang, the collaborator from computer science, also sees great potential: "The massive amount of astronomical data is always fascinating to computer scientists. But, when techniques such as machine learning emerge, astrophysics also provides a great test bed for tackling a fundamental computational question - how do we integrate and take advantage of the knowledge that humans have accumulated over thousands of years, using a machine learning system? We hope our collaboration with Kevin can also shed light on this question."
The success of the project points to a more "data-driven" future for astrophysics in which information is learned automatically from data, instead of manually crafted physics models. ETH Zurich is hosting this work on the space.ml cross-disciplinary astrophysics/computer-science initiative, where the code is available to the general public.
More information: Kevin Schawinski et al, Generative Adversarial Networks recover features in astrophysical images of galaxies beyond the deconvolution limit, Monthly Notices of the Royal Astronomical Society: Letters (2017). DOI: 10.1093/mnrasl/slx008 
Journal reference: Monthly Notices of the Royal Astronomical Society: Letters
Provided by: Royal Astronomical Society



Read more at: https://phys.org/news/2017-02-neural-net...s.html#jCp



Quote:Your most important tool in Space Imaging....
after your brain that is...

a bad ass computer.

Makes all the difference in the world.

Computing with biochemical circuits made easy

February 23, 2017



[Image: dna.png]

From left to right, the structures of A-, B- and Z-DNA. Credit: Wikipedia
Electronic circuits are found in almost everything from smartphones to spacecraft and are useful in a variety of computational problems from simple addition to determining the trajectories of interplanetary satellites. At Caltech, a group of researchers led by Assistant Professor of Bioengineering Lulu Qian is working to create circuits using not the usual silicon transistors but strands of DNA.




Read more at: https://phys.org/news/2017-02-biochemical-circuits-easy.html#jCp
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
#10
I wonder if Keith could get even better results using the 2nd-generation Moc-o-Matic 2 (which I believe he still uses as his major noise remover), then applying his own "magic" with brightness, contrast, color, and other modifications. The 2nd-generation Moc-o-Matic was "crucial" at one time for doing ANY MSSS images.

Keith Hmm2


Bob... Ninja Alien2
"The Light" - Jefferson Starship-Windows of Heaven Album
I'm an Earthling with a Martian Soul wanting to go Home.   
You have to turn your own lightbulb on. ©stevo25 & rhw007
#11
Actually, Bob, I haven't used it in years.
The images don't have the same type of noise in them now as the MOCs did,
but the things I'm using now to do basically the same job are FAR more advanced and efficient.
On a satellite I ride. Nothing down below can hide.