Does Ethernet Do Error Correction?
Is Ethernet considered a reliable protocol? (Network Engineering Stack Exchange, http://networkengineering.stackexchange.com/questions/9998/is-ethernet-considered-a-reliable-protocol)
Asked by compguy24, Jul 17 '14: Ethernet uses a sliding window to resend lost frames (error correction). It uses a sequence number to account for unordered frames (error correction). It has a frame check sequence to discard corrupted frames (error detection). Does this make it "reliable"?

Comment (YLearn): Your information is wrong. Ethernet doesn't resend lost frames, nor does it use sequence numbers. As such, the basis for the question is invalid, unless you would care to edit the question.

Answer (Ricky Beam, 6 votes): Not even remotely. There are no mechanisms for signaling a dropped frame, or why it was dropped (CRC error, too small ("runt"), too big, no buffer space), so there is no way to know what needs to be resent. There are no sequence numbers in Ethernet frames; the payload may contain one. The only time Ethernet will resend a frame is when the transmitter knows a collision occurred, and with 99.99999% of gear being full duplex today, collisions never happen. Incidentally, this is why iSCSI SANs use special switches with very large internal per-port buffers, and why FCoE (Fibre Channel over Ethernet) uses special hardware: both require a reliable transport, and Ethernet on its own isn't one.

Answer (0 votes): Ethernet is not a protocol. In the OSI model, Ethernet would be in the physical layer, and there are no protocols there; there may be specifications for how long the wires can be, how thick, etc., but those
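The frame-check-sequence behavior described in the answer can be sketched in a few lines of Python. This is a simplified model, not Ethernet's exact FCS arithmetic (real Ethernet CRC-32 has specific bit-ordering and complement conventions); `zlib.crc32` stands in as a generic CRC-32, and `append_fcs`/`check_fcs` are names invented for the sketch:

```python
import struct
import zlib

def append_fcs(payload: bytes) -> bytes:
    """Sender side: append a 32-bit frame check sequence, as Ethernet does."""
    fcs = zlib.crc32(payload) & 0xFFFFFFFF
    return payload + struct.pack("<I", fcs)

def check_fcs(frame: bytes) -> bool:
    """Receiver side: recompute the CRC and compare. On mismatch the
    frame is silently dropped -- nothing tells the sender to resend."""
    payload, fcs = frame[:-4], struct.unpack("<I", frame[-4:])[0]
    return (zlib.crc32(payload) & 0xFFFFFFFF) == fcs

frame = append_fcs(b"hello, ethernet")
assert check_fcs(frame)                           # clean frame passes

corrupted = bytes([frame[0] ^ 0x01]) + frame[1:]  # flip one payload bit
assert not check_fcs(corrupted)                   # detected, frame dropped
```

Note what is absent from the receiver side: there is no path back to the sender, which is exactly the answer's point about Ethernet offering detection without retransmission.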
What if CRC bits have an error in an Ethernet frame? (Stack Overflow, http://stackoverflow.com/questions/12967953/what-if-crc-bits-have-error-in-ethernet-frame)

Asked by Vaibhav Agarwal, Oct 19 '12: The Ethernet frame contains 32 CRC (Cyclic Redundancy Check) bits for checking errors. Won't there be a huge problem if the CRC bits themselves are changed but the message/payload is correct? Is there a way to detect, avoid, and correct this?

Accepted answer (user1500049, 2 votes): Whether the CRC itself is good or bad, as long as it doesn't verify the payload (even if the payload is still good), the Ethernet frame is considered to have a CRC error and should be dropped at layer 2. In general, we don't do any error "correction" with Ethernet; CRC is for error detection, not correction. An upper-layer protocol (say, TCP) is responsible for reliable delivery and does the retransmission.

Comment (EJP): "Even if the payload is still good" is meaningless. A wrong CRC means something is wrong in the packet; you can't know what it is.

Comment (user1500049): "Even if the payload is still good" was referring to the OP's question, which ends "...but the message/payload is correct?" It's not a statement I made; I know it's meaningless because we can't tell whether it's good.

Answer (2 votes): If the CRC of the message does not match the CRC after the message, then you only know that there are errors somewhere in the combination of the
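The accepted answer's point, that the receiver only ever observes a mismatch and cannot tell whether the payload or the CRC field itself was damaged, can be illustrated with a small sketch (again using `zlib.crc32` as a stand-in for Ethernet's actual CRC-32 conventions):

```python
import zlib

payload = b"the payload itself arrived intact"

# CRC the sender computed and placed in the FCS field.
good_fcs = zlib.crc32(payload) & 0xFFFFFFFF

# Suppose noise flips a bit in the FCS field only; the payload is fine.
received_fcs = good_fcs ^ 0x00000001

# The receiver recomputes the CRC over the payload and compares.
# It sees only "mismatch" -- it cannot tell whether the payload or
# the CRC bits were corrupted, so the frame is dropped either way.
recomputed = zlib.crc32(payload) & 0xFFFFFFFF
assert recomputed == good_fcs
assert recomputed != received_fcs   # mismatch -> frame dropped at layer 2
```

Either way the frame is discarded and, as the answer notes, recovery is left to an upper-layer protocol such as TCP.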
Intro (from http://ecee.colorado.edu/~ecen4242/ethernet/ethernet.html) - The following excerpt from Ethernet: The Definitive Guide (O'Reilly and Associates, 2000) explains where the name "Ethernet" comes from: “In late 1972, Metcalfe and his Xerox PARC colleagues developed the first experimental Ethernet system to interconnect the Xerox Alto, a personal workstation with a graphical user interface. The experimental Ethernet was used to link Altos to one another, and to servers and laser printers. The signal clock for the experimental Ethernet interface was derived from the Alto's system clock, which resulted in a data transmission rate on the experimental Ethernet of 2.94 Mbps. Metcalfe's first experimental network was called the Alto Aloha Network. In 1973 Metcalfe changed the name to "Ethernet," to make it clear that the system could support any computer--not just Altos--and to point out that his new network mechanisms had evolved well beyond the Aloha system. He chose to base the name on the word "ether" as a way of describing an essential feature of the system: the physical medium (i.e., a cable) carries bits to all stations, much the same way that the old "luminiferous ether" was once thought to propagate electromagnetic waves through space. Thus, Ethernet was born.” [4] A twisted-pair 10BASE-T cable is used to transmit 10BASE-T Ethernet. Typical Applications - Ethernet is a large and diverse family of frame-based computer networking technologies for local area networks (LANs). It defines a number of wiring and signaling standards for the physical layer, two means of network access at the Media Access Control (MAC)/Data Link Layer, and a common addressing format.
Ethernet has been standardized as IEEE 802.3. Its star-topology, twisted pair wiring form became the most widespread LAN technology in use from the 1990s to the present, largely replacing competing LAN standards such as coaxial cable Ethernet, token ring, FDDI, and ARCNET. In recent years, Wi-Fi, the wireless LAN standardized by IEEE 802.11, has been used in addition to or instead of Ethernet in many installations [3]. Ethernet is used for LANs everywhere, from yo
Is there any error correction across an HDMI cable? (Ars Technica forum, http://arstechnica.com/civis/viewtopic.php?p=6271817, 27 posts)

Perrin49 (Sep 05, 2006): I know HDMI is digital, so I assumed that there would be some sort of error correction built in. The cable would either work or it wouldn't, like bad satellite TV signals: the picture is there, then it isn't (or it is suddenly, obviously stuttering). So I recently read that my assumption is wrong; there is no error correction in HDMI. Do quality TVs attempt any compensation for this, or, with the higher bit rate of video, does a bit here and a bit there really matter?

ZPrime: There's no error correction. One bit in a stream of info isn't going to make a huge difference in the display unless it's in a header/protocol block or something. I don't know the technical details of how HDMI is encoded (other than that it is similar to DVI), so I can't be more helpful than that.

Amyd: HDMI uses a BCH error correction code for the packets.
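To make the contrast running through this page concrete: a CRC can only detect damage, while a code like the BCH code mentioned above carries enough redundancy to locate and repair it. The sketch below uses Hamming(7,4), a much simpler single-error-correcting code than BCH, purely to illustrate the principle; it is not HDMI's actual code.

```python
def hamming74_encode(d: list) -> list:
    """Encode 4 data bits into a 7-bit Hamming codeword.
    Bit positions are 1..7; parity bits sit at positions 1, 2, 4."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c: list) -> list:
    """Recompute the parity checks; the syndrome is the 1-based
    position of a single flipped bit (0 means no error)."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3
    c = c[:]
    if syndrome:
        c[syndrome - 1] ^= 1           # flip the offending bit back
    return [c[2], c[4], c[5], c[6]]    # extract the four data bits

data = [1, 0, 1, 1]
codeword = hamming74_encode(data)
codeword[5] ^= 1                       # single bit error in transit
assert hamming74_correct(codeword) == data   # repaired, no retransmission
```

This is why a single flipped bit on a CRC-protected Ethernet frame costs the whole frame, while the same flip inside an error-corrected HDMI packet is repaired at the receiver.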