I'm not a professional video editor or even a hardcore hobbyist; I have only the basic knowledge I've picked up on my own over the past few years. Still, there's a question that's been on my mind...
I record video game consoles, and I'm planning to archive the gameplay as videos.
I would prefer to record in 1080p from the consoles, but the capture card (Blackmagic Design Intensity Shuttle) only supports up to 1080p30, which means the software won't recognize the source, since I believe the consoles output 1080p at 59.94 fps.
The next-best option is 1080i 59.94, recorded as AVI 10-bit YUV, and after that comes 720p.
However, since 1080i is interlaced, it looks awful on a computer screen and needs to be deinterlaced.
Now the question is: does deinterlaced 1080i look as good as "native" 1080p, if the process is done correctly?
I've done some searching, and people give both answers: some say it will look just as good, while others say it won't due to artifacts and 'jitter' (I have no idea what the latter is).
Also, if I go with 1080i, is it wise to render it at 29.97 fps, since it is interlaced?
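For context, this is roughly the kind of deinterlacing step I had in mind, using ffmpeg's standard yadif filter. This is just a sketch, not a recommendation from anyone who knows better; the file names are made up, and I generate a fake 1080i clip from ffmpeg's built-in test pattern so the commands can be tried without a real capture:

```shell
# Make a 1-second fake 1080i 29.97 clip (59.94 fields/s) from the
# built-in test pattern, standing in for a real AVI capture:
ffmpeg -y -f lavfi -i testsrc2=duration=1:size=1920x1080:rate=60000/1001 \
       -vf tinterlace=mode=interleave_top -c:v ffv1 capture_1080i.avi

# yadif mode=0 emits one progressive frame per interlaced frame -> 29.97p:
ffmpeg -y -i capture_1080i.avi -vf yadif=mode=0 -c:v ffv1 deint_2997p.avi

# yadif mode=1 emits one frame per field -> 59.94p, keeping full motion:
ffmpeg -y -i capture_1080i.avi -vf yadif=mode=1 -c:v ffv1 deint_5994p.avi
```

As I understand it, the choice between the two modes is exactly my 29.97-vs-59.94 question: mode=0 halves the temporal resolution, while mode=1 keeps every field as its own frame.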