Hacker News | CorrectHorseBat's comments

Median doesn't say anything about the extremes and income isn't wealth.


Bicycle bells can be used to warn other cyclists


Not if they wear thick or noise-cancelling headphones.


The real question is, why do we even need this? Why don't VHDL and Verilog just simulate what hardware does? Real hardware doesn't have any delta cycles or determinism issues due to scheduling. Same thing with sensitivity lists (yes, we have */all now, so that's basically solved), but why design it so that it's easy to shoot yourself in the foot?
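
For example, the classic sensitivity-list footgun, as a minimal plain-Verilog sketch (module and signal names are made up for illustration):

    // 'b' is missing from the sensitivity list, so the simulator won't
    // re-evaluate y when only b changes -- but synthesis ignores the
    // list and builds a correct mux. Simulation != hardware.
    module mux_bad (input a, b, sel, output reg y);
        always @(a or sel)
            y = sel ? a : b;
    endmodule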


What do you mean by simulate? Do you want the language to be aware of the temperature of the silicon? Because I can build you circuits whose behaviour changes with the temperature of the silicon. Essentially, all these languages are not timing-aware. So you design your circuit with combinational logic and a clock, and then hope (pray) that your compiler makes it meet timing.

The fundamental problem is that we're trying to create a simulation model of real hardware that is (a) realistic enough to tell us something reasonable about how the hardware will behave and (b) computationally efficient enough to tell us about (a) in a reasonable period of time.


"All sensors are temperature sensors, some measure other things as well"


The only way to simulate what real hardware does is to synthesise the design, get a netlist, and do a gate-level simulation. This is incredibly slow, both to compile and to simulate.

You could, of course, simplify the timing model a lot. In the end you get down to “there is some time passing for the signal to get through this logic; we don’t know how much, but we assume it’s less than any clock period”, in which case we end up with delta cycles.
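
In Verilog terms, the same assumption shows up as the nonblocking assignment: everything on a clock edge is sampled before anything updates, in zero simulated time. A minimal sketch (names made up):

    // Two registers swap values on the same edge. Both right-hand
    // sides are sampled before either update lands -- the zero-time
    // ("delta") ordering described above.
    module swap (input clk, output reg a, b);
        initial begin a = 0; b = 1; end
        always @(posedge clk) begin
            a <= b;  // old b
            b <= a;  // old a: no race, thanks to nonblocking semantics
        end
    endmodule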


Real hardware has clock trees. Wouldn't all (most?) problems with delta cycles go away if the HDL understood the concept of clocks and clock balancing?


> Why don't VHDL and Verilog just simulate what hardware does?

Real hardware has hold violations. If you get your delta cycles wrong, that's exactly what you get in VHDL...

They're both modeling languages. They can model high-level RTL or gate level, and the two can behave very differently if you're not careful. "Just simulate what the hardware does" is itself an ambiguous statement: sometimes you want one model, sometimes the other.
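
To make the hold-violation analogy concrete, here's a sketch of getting the ordering wrong in Verilog (hypothetical module):

    // Blocking assignments in a shift register: q2 samples q1's *new*
    // value on the same edge, so d shoots through both stages -- the
    // simulation analogue of a hold violation.
    module shift_bad (input clk, d, output reg q1, q2);
        always @(posedge clk) begin
            q1 = d;   // blocking: updates immediately
            q2 = q1;  // sees the already-updated q1
        end
    endmodule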


Draw yourself an SR latch and try simulating it. Or the circuit known as a "pulse generator".


Both SystemVerilog and VHDL have AMS extensions for simulating analog circuits. They work pretty well but you also pay a pretty penny for the simulator licenses for them.


Those are analog circuits; if you put them in your digital design, you are doing something wrong.


Don't know if trolling. An SR latch you can build with 2 NANDs or NORs; there are plenty of *digital* circuits with that functionality, and yes, there are very rare cases when you construct this out of logic rather than use a library cell for it. The pulse circuit is AND(not(not(not(a))), a), also rarely used, but used nonetheless. To properly model/simulate them you would need delta cycles.
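
For concreteness, gate-level sketches of both (module names made up):

    // Cross-coupled NAND SR latch: pure feedback, no clock.
    module sr_latch (input sn, rn, output q, qn);
        nand (q,  sn, qn);
        nand (qn, rn, q);
    endmodule

    // Pulse generator: AND(not(not(not(a))), a). The transient pulse on
    // a rising edge of a only resolves in simulation because the
    // inverter chain is evaluated over successive zero-time scheduling
    // steps (VHDL's delta cycles).
    module pulse_gen (input a, output y);
        wire n1, n2, n3;
        not (n1, a);
        not (n2, n1);
        not (n3, n2);
        and (y, n3, a);
    endmodule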


I'm not sure if you are trolling. 99.999% of digital design is "if rising edge clk new_state <= fn(old_state, input)", with an (a)sync reset. The language should make that the default and simple to do, and anything else out of the ordinary hard. Now it's more the other way around.
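
That pattern, spelled out as a minimal Verilog sketch with an async reset (the adder stands in for fn; names made up):

    module fsm (input clk, rst_n, input [7:0] in, output reg [7:0] state);
        always @(posedge clk or negedge rst_n) begin
            if (!rst_n) state <= 8'd0;        // async reset
            else        state <= state + in;  // fn(old_state, input)
        end
    endmodule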


All circuits are analog when physically realized, the digital view is an abstraction.


I’m not exactly sure what you’re getting at, but I think I’ve had a similar question: why don’t HDLs have language elements more representative of what digital circuits are constructed from, namely synchronous and asynchronous circuits, rather than imperative, input-triggered blocks (processes, IIRC, it’s been a while)?

I always thought it was confusing to design a circuit mentally (or on paper) out of things like muxes, encoders, flip-flops, etc., and not have language-level elements to represent them (without defining your own components, obviously).

I remember looking this up, and I believe it’s because the languages were originally designed for simulation and verification, and there are things you might want to do in a simulation/verification language for testing that are outside of what the hardware can do. Mixing the two is confusing IMO, but clearly demarcating the hardware-realizable subset of the language would be better than the current state.


Because it's both slow and terrible?

You generally do not want to simulate or describe raw gate-level netlists. Both languages are capable of that. Old-school Verilog (not SystemVerilog) is still the de facto netlist exchange format for many tools.

It's just aggravatingly slow to sim and needlessly verbose. Feeding high-level RTL to Verilator to do basic cycle-accurate sim has exceptionally fast iteration speed these days.


Is it really, if you restrict yourself to sensible design practices? You generally want to simulate simple clocked logic with a predefined clock; most of the time anything else is a mistake or bad design. So just "if rising edge clk: next_state <= fn(previous_state, input)". It seems to me VHDL and Verilog are simply at the wrong abstraction level, and by that they make simulation needlessly complicated and design easy to get wrong. To me it seems that if they had the concept of clocks instead, none of this would be necessary and many bugs would be avoided (but I'm no expert on simulator design, so I might be missing something...)


I agree basically with everything you're saying, but that's not arguing for raw gate netlists. If anything it's arguing for even higher levels of abstraction where clock domains are implicit semantic contexts.

Many new school HDLs are working in this space and they couldn't be farther from the "representative of what digital circuits are constructed from" idea. Often they're high-level programmatic generators, very far from describing things in terms of actual PDK primitives.


In a way it's further away, but in another way it's actually closer to how real hardware works: clock (and reset) trees are real physical things that exist on all digital chips.


Don't you need to register and actively defend your trademark for it to apply?


There are unregistered trademarks as well as registered ones. Usually the "TM" symbol is applied to unregistered trademarks, and the ® symbol for registered ones. Both enjoy protection, although it's generally an easier time in court when your trademark is registered.

Whether actively defending your trademark is actually required is a bit of a nuanced topic. Generally, trademarks can be lost through genericide (the mark becomes a generic term for the type of product) or abandonment. Abandonment happens when either the mark owner stops using the mark itself, or takes an action that weakens the mark. The question, then, is whether failing to defend infringing use constitutes a weakening action. Courts differ on this, and there is a large gray area between "we didn't immediately sue a local mom-and-pop shop" and "we allowed a rival company to use the mark erroneously across several states for years without taking action."


In this case, the name is already so generic that you might even be denied a trademark in the first place.


I think we'll be soon at the point where articles are written by asking AI to extend a three point bullet list to 30 pages, and read by asking AI to summarize articles into a three point bullet list.


This drives me nuts. It's been going on for years: a simple "if this, do that" deal gets encoded in an overly elaborate 10-minute YouTube video where at least 9 minutes of it are filler. You know the kind, where you start skimming the comments to see if anyone bothered to summarize it.

AI amplifies the problem by making it easier to produce filler, but the root problem is whatever metrics are behind the monetization. You need users to "engage" with your content for at least x amount of time to earn y amount of money, whereas earnings should instead be derived directly from how useful the content is, and to how many users.


Another possible benefit I've heard of is that it can stop some kinds of voter intimidation:

Someone gets hold of an empty ballot, fills it in, gives it to you, and tells you to cast it and come back with another empty ballot. Rinse and repeat. Of course, with today's smartphones there are simpler ways to do this. It's also moot if you can vote by mail, which is why voting by mail is a really bad idea.


There's a much simpler solution for in-person voting, used in Italy for example: there's a numbered sticker on the ballot. When you get the ballot, the sticker's ID is written down; when you come out of the voting booth, it gets verified and detached.

The ballot you throw in the box does not have the sticker (the vote is anonymous), and you cannot come in with a pre-marked ballot (and carry one out), because the number would not match.


Sorry I do not understand how this would work. Can you explain in more detail?


If the maintainers are already bandwidth limited, how is first asking annoying questions not also a drain on that bandwidth?


I can understand that drive-by features can be a net burden, but what is wrong with a drive-by bugfix?


I configure it in the firmware of my keyboard with QMK


>Search is currently unavailable when logged out

Do you have any specific links?


