Leaderboard


Popular Content

Showing content with the highest reputation since 03/21/19 in Posts

  1. 2 points
    JColvin

    Read from MicroSD in HDL, Write on PC

    Hi @dcc, I'm not certain how you are verifying that the HDL is writing to and then reading back from the SD card in a normal formatting style, but in general FAT32 is a widely used format for SD cards with plenty of existing material. I am unsure why you are using a special tool to write to the SD card, though; from what I can tell the tool is Windows compatible, so why not just use Notepad (which comes with Windows) to save a .txt file with the data you are interested in reading, or use Windows Explorer (the file manager) to move the file of interest onto the SD card? If your file does have a header, you will need to account for that, though I do not know what you mean by "random file" in this case. Thanks, JColvin
  2. 2 points
    SeanS

    Genesys 2 DDR Constraints

    Hi JColvin, I am definitely not using ISE. I think JPeyron had it right: I didn't have my board.repoPaths variable set, so the project wasn't finding the board files. Once I set this variable as suggested, the pin mapping and IO types were auto-populated as expected. Kudos, Sean
  3. 2 points
    @jpeyron @D@n I fixed the bug in my SPI Flash controller design. Now I can read from Flash memory.
  4. 2 points
    Hi @Blake, I was struggling with the same problem. Adam's project contains a mistake that makes the FMC-HDMI module unrecognizable by other devices: it does not send EDID at all, because the EDID map is initialized incorrectly. In Adam's example the EDID is initialized by: but the correct way is: the body of iic_write2 is from the LK example: By the way, in the LucasKandle example the initialization is done the same way as in Adam's example, which is why it did not work in your case either. I hope it helps. If you want, I will post my working code for a ZedBoard with FMC-HDMI once I clean it up, because at the moment it is kind of messy.
  5. 1 point
    @askhunter I suggest that you read UG479 to see what the DSP48E blocks do. Then read UG901 to see what the use_dsp attributes do. Reading the recipe doesn't always help improve the cooking but it never hurts. A long time ago having signed multipliers in hardware was a big deal for FPGA developers. For the past decade or so these have become integrated into more complicated and useful 'DSP' blocks. The DSP nomenclature is a holdover from the days, long before IEEE floating point hardware was available, when having a fast multiplier in hardware meant that you could do some fun stuff in a micro-controller that you couldn't do with software routines. These days the lines are blurry. Most FPGA devices have some really fast hardware features, block ram and DSP blocks ( depending on how they are used ) being the most useful for grinding out mathematical algorithms. By the way, the DSP blocks can be useful for more than multiply-add operations.
  6. 1 point
    jpeyron

    Pmod IA AD5933 measurement speed

    Hi @M.Mahdi.T, Welcome to the Digilent Forums! We typically do not test the max throughput metric when validating our products. Looking at the AD5933 datasheet here, the Pmod IA would be limited to 1 MSPS due to the on-board ADC. The main bottleneck for throughput to a host board will be the I2C communication, which runs at 400 kHz. Here is a good forum thread on using the Pmod IA with the Raspberry Pi, with code and hints for use, setup and calibration. Here is the resource center for the Pmod IA, which has the Raspberry Pi code as well. best regards, Jon
  7. 1 point
    @Ahmed Alfadhel To understand what's going on, check out table 8 of the datasheet on page 15. Basically, the DAC provides outputs between 0 and max, where 0 is mapped to zero and all ones is mapped to the max. In other words, you should be plotting your data as unsigned. To convert from your current twos complement representation to an unsigned representation where zero or idle is in the middle of the range, rather than on the far end, just toggle the MSB. Dan
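A tiny Python sketch of the MSB toggle Dan describes, mapping two's-complement samples to the DAC's unsigned (offset-binary) coding; the function name is mine, not from any library:

```python
def twos_complement_to_offset_binary(sample: int, bits: int = 16) -> int:
    """Map a two's-complement code to offset binary by flipping the MSB.

    Mid-scale (0) lands in the middle of the unsigned range instead of
    at the far end, which is what the DAC expects.
    """
    mask = (1 << bits) - 1
    return (sample & mask) ^ (1 << (bits - 1))

# zero idles at mid-scale, full negative at 0, full positive at max
mid = twos_complement_to_offset_binary(0)        # 0x8000
low = twos_complement_to_offset_binary(-32768)   # 0x0000
high = twos_complement_to_offset_binary(32767)   # 0xFFFF
```

The same single-XOR trick works in HDL: invert only the sign bit of the bus.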
  8. 1 point
    Hi, I think a UART is the least effort. Parsing ASCII hex data in a state machine is easy and intuitive to debug, at the price of 50 % throughput. If you like, you can have a look at my busbridge3 project here; it goes up to 30 MBit/second. The example RTL includes very simple bus logic with a few registers, so it's fairly easy to connect your own design. Note, it's not meant for AXI, MicroBlaze or the like, as it occupies the USB end of the JTAG port. In theory, it should work on any Artix & FTDI board as it doesn't need any LOC-constrained pins.
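The byte-at-a-time ASCII-hex parsing mentioned above can be sketched in a few lines; this is an illustrative model of the state machine, not code from the busbridge3 project:

```python
def parse_ascii_hex(stream: str) -> list[int]:
    """Parse hex words one character at a time, as a state machine would.

    State is just the accumulated value plus a 'have digit' flag; any
    non-hex character acts as a separator that flushes the current word.
    """
    words, value, have_digit = [], 0, False
    for ch in stream:
        if ch in "0123456789abcdefABCDEF":
            value = (value << 4) | int(ch, 16)  # shift in one nibble
            have_digit = True
        else:
            if have_digit:
                words.append(value)             # separator: emit word
            value, have_digit = 0, False
    if have_digit:
        words.append(value)                     # flush a trailing word
    return words
```

In RTL the same structure becomes a nibble accumulator register and a one-bit flag, which is why it is cheap and easy to debug.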
  9. 1 point
    bogdan.deac

    OpenCV and Pcam5-c

    Hi @Esti.A, SDx, which includes SDSoC (Software Defined System on Chip), is a development environment that allows you to develop a computer vision application, in your case, using C/C++ and the OpenCV library. The targets of SDx-built applications are Xilinx systems on chip (SoC) (Zynq-7000 or Zynq UltraScale+). The Xilinx SoC architecture has two main components: an ARM processor (single or multi core) named the Processing System (PS), and an FPGA, named the Programmable Logic (PL). Using SDx to build an application for an SoC allows you to choose which functions of your algorithm are executed in the PS and which ones are executed in the PL. SDx will generate all the data movers and dependencies that you need to move data between the PS, DDR memory and the PL. The PL is suitable for operations that can be easily executed in parallel. So if you choose a median filter function to be executed in the PL instead of the PS, you will obtain better throughput from your system. As you said, you can use OpenCV to develop your application. You have to take into account that the OpenCV library was developed with CPU architectures in mind. The library was designed to obtain the best performance on some specific CPU architectures (x86-64, ARM, etc.). If you try to accelerate an OpenCV function in the PL using SDx, you will obtain poor performance. To overcome this issue, Xilinx has developed xfopencv, which is a subset of the OpenCV library functions. The functionality of the xfopencv functions and the OpenCV functions is the same, but the xfopencv functions are implemented with the FPGA architecture in mind. xfopencv is written in C/C++ following certain coding guidelines. When you build a project, the C/C++ code is given as input to the Xilinx HLS (High Level Synthesis) tool, which converts it to an HDL (Hardware Description Language) design that is then synthesized for the FPGA. The above-mentioned coding guidelines describe how to write C/C++ code that will be implemented efficiently in the FPGA.
To get a better understanding of xfopencv, consult this documentation. So SDx helps you obtain better performance by offloading the PS and by taking advantage of the parallel execution capabilities of the PL. Have a look at the SDSoC documentation. For more details check this. An SoC is a complex system composed of a Zynq (ARM + FPGA), DDR memory and many types of peripherals. On top of those, one can run a Linux distribution (usually PetaLinux, from Xilinx), and on top of the Linux distribution runs the user application. The user application may access the DDR memory and different types of peripherals (the PCam in your case). It may also accelerate some functions in the FPGA to obtain better performance. To simplify the development pipeline, Xilinx provides an abstract way to interact with all of this, named an SDSoC platform. An SDSoC platform has two components: a Software Component and a Hardware Component, which together describe the system from the hardware up to the operating system. Your application will interact with this platform. You are not supposed to know all the details of this platform; that was the idea, to abstract things away. Usually, SDSoC platforms are provided by SoC development board vendors, like Digilent. All you have to do is download the latest SDSoC platform release from GitHub. You have to use SDx 2017.4. You don't have to build your own SDSoC platform; that is a complex task. You can follow these steps to build your first project using the PCam and the Zybo Z7 board. The interaction between the PCam and the user application works as follows: an IP in the FPGA acquires the live video stream from the camera, and the video stream is written into DDR memory. This pipeline is abstracted by the SDSoC platform. The user application can access the video frames through Video4Linux (V4L2). The Live I/O for PCam demo shows you how to do this. I suggest you read the proposed documentation to gain the basic knowledge needed for SDSoC project development. Best regards, Bogdan D.
  10. 1 point
    jpeyron

    Arty A7 USB mechanical USB problem

    Hi @acm45, I reached out to one of our design engineers about this thread. They responded that: "there are no series resistors between the D+ and D- pins of the connector and those pins on the FT2232HQ. I think it would be very difficult to solder the D+ and D- wires directly to the pins of the USB controller. My suggestion is to purchase an external JTAG programmer/debugger (JTAG-HS2) and attach it to header J8." best regards, Jon
  11. 1 point
    Hi @learni07, Can you please check the baud rate in Tera Term? For this, click on Setup->Serial port and check the Baud Rate drop-down menu. It has to be 115200. Best Regards, Bogdan Vanca
  12. 1 point
    attila

    Wait for averaging in script

    Hi @Phil_D
    Spectrum.run()
    for(var i = 0; i < 50 && Spectrum.wait(); i++);
    Spectrum.stop()
  13. 1 point
    Hi @Phil_D @bvleo The issue is fixed in the latest beta version. You can call: subprocess.Popen(['C:/Program Files (x86)/Digilent/WaveForms3/WaveForms.exe','default.dwf3work','-runscript']) When a WaveForms app instance is already running, the above call will notify it to load and run the script. The '-runscript' flag will suppress the 'save current workspace' message.
  14. 1 point
    Hi Jon! I have resolved that issue by using a case statement to assign the bits instead of using the - operator. Hence I deleted the post immediately. Were you still able to see it?
  15. 1 point
    tommienator

    Pmod DA3

    Hey everybody, A small question. I need to generate a 28.8 MHz signal (sine wave), so in the FPGA I've built a DDS and it's generating a nice sine wave @ 27.799 MHz (close enough :p). For connecting this to a certain target board I've bought the Pmod DA3 (https://store.digilentinc.com/pmod-da3-one-16-bit-d-a-output/). There is only one small problem that I can't figure out... I know that when you're sampling you have to comply with Shannon's theorem, but how is it with DACs when you are reconstructing a signal: do they also have to comply with it? I've dived into the schematic of the Pmod DA3 and saw that a chip from Analog Devices is used, more specifically the AD5541A. Looking into the datasheet, I saw that it has a 50 MHz SPI/QSPI/... interface, meaning that SCLK can be at most 50 MHz when a supply voltage of 2.7 - 5.5 V is applied... Now the question that I have is: does a DAC need to comply with Shannon's theorem, meaning that the absolute maximum I can generate with this DAC is a 25 MHz signal (in ideal conditions), or does a DAC not have to comply, so I can just generate the signal without any problem :)? Thank you :)!
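Not an answer to the reconstruction-theory question, but a back-of-the-envelope sketch of how the serial interface itself bounds the update rate, assuming one 16-bit frame per sample and ignoring chip-select overhead (my assumptions, not from the datasheet):

```python
# AD5541A serial-interface limit, one 16-bit word per conversion assumed
SCLK_MAX_HZ = 50e6          # max serial clock from the datasheet
BITS_PER_SAMPLE = 16        # one 16-bit frame loads one output value

max_update_rate = SCLK_MAX_HZ / BITS_PER_SAMPLE   # samples/s the SPI allows
nyquist_limit = max_update_rate / 2               # highest representable tone
```

Under these assumptions the SPI frame rate, not SCLK itself, is the binding limit, and the sampling theorem then applies to that update rate.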
  16. 1 point
    Hi @Phil_D I just noticed that the -runscript option only works when the instruments are in docking window mode. My bad, -runscript only looks for a docked Script window. It will be fixed in the next software version.
  17. 1 point
    Hi @Phil_D There is no zero padding option but it can be done with Script like this:
    var rg = Spectrum.Channel1.data // channel 1 time domain data
    var c = rg.length
    var t = rg[c-1] // last sample
    for(var i = 0; i < c; i++) rg.push(t) // 2x padding
    var rghz = Spectrum.Trace1.frequency
    var hz = 2.0*rghz[rghz.length-1]
    //var hz = 2.0*Spectrum.Frequency.Stop.value // scope sample rate
    Spectrum.Trace5.setSamples(rg, hz)
    Some other suggestions to improve the resolution:
    1. For lower frequencies, with 1 MHz sampling you can use the Scope to perform a longer recording. This will highly improve the resolution in the FFT view.
    2. With the AD you can select the second device configuration to have a 16k Scope buffer.
    3. You can select a higher bandwidth window, like rectangular or cosine.
    4. In the latest beta version, with the CZT algorithm you can select a higher number of bins for higher resolution. https://forum.digilentinc.com/topic/8908-waveforms-beta-download/
    Here:
    - T1 is CZT BlackmanHarris 10x BINs, 244 Hz resolution
    - T2 is FFT BlackmanHarris 4k BINs, 2.4 kHz resolution
    - T3 is FFT Cosine 4k BINs, 2.4 kHz resolution
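The padding trick above can be reproduced in NumPy to see its effect on bin spacing; this is an illustrative sketch (random data as a stand-in for the captured record), and note that padding interpolates the spectrum rather than adding true resolution:

```python
import numpy as np

fs = 1e6                        # sample rate, as in the post
x = np.random.randn(4096)       # stand-in for the captured record

pad = np.full(len(x), x[-1])    # pad with the last sample, as the script does
x2 = np.concatenate([x, pad])   # 2x padding

df_original = fs / len(x)       # original FFT bin spacing, ~244 Hz
df_padded = fs / len(x2)        # padded bin spacing, half as coarse
```

A longer real recording (suggestion 1 in the post) is what actually improves resolution; padding only refines the frequency grid.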
  18. 1 point
    jpeyron

    VGA on Zybo

    Hi @Mukul, Here is a VHDL VGA project that has pixel clock frequencies for multiple resolutions. Here and here are non-digilent VGA tutorials. Here is a listing for different pixel frequencies and resolutions. best regards, Jon
  19. 1 point
    Hey @Phil_D I too had a problem with getting WaveForms to run automatically, and my group didn't find a way to make it work through the suggested code. However, we did have success when we commanded Python to simulate an F5 key press, which is a shortcut in WF to run the script. We used the library "uinput". Here's a sample of the actual command we used. I'm not certain that it's all you need, as I was not the primary Python programmer for this project, but at least it'll give you an idea of what needs to be done for this workaround. The sleep timers are there to give the program time to start and load the workspace.
    import subprocess
    import time
    import uinput
    # 'waveforms' holds the path to the WaveForms executable
    waveform_Call = subprocess.Popen("exec " + waveforms, shell = True)
    time.sleep(10)
    device = uinput.Device([ uinput.KEY_FN, uinput.KEY_F5, ])
    time.sleep(1)
    device.emit_combo([ uinput.KEY_FN, uinput.KEY_F5, ])
  20. 1 point
    jpeyron

    Hello world program for zynq linux

    Hi @Ram, I moved your thread to a section where more experienced embedded Linux engineers look. best regards, Jon
  21. 1 point
    Hello Jon! Thank you for giving me hope that I can code with whatever I am comfortable with.
  22. 1 point
    jpeyron

    ZYBO HDMI IN project

    Hi @birca123, I would suggest starting with a fresh SDK portion of the project. To do this, close SDK and delete the hdmi-in.sdk folder in the \Zybo-hdmi-in\proj folder. Then in Vivado click File->Export->Export Hardware, including the bitstream. Then launch SDK from Vivado by clicking File and selecting Launch SDK. Once in SDK and with the HW platform loaded, click File and import HDMI_IN and HDMI_IN_bsp from Zybo-hdmi-in\sdk. Once you have HDMI_IN and HDMI_IN_bsp in your SDK project, program the FPGA. Next, open a serial terminal emulator like Tera Term and connect to the Zybo's COM port. Set the baud rate to 115200; everything else should be left at default settings. Now connect the Zybo to the HDMI and VGA devices. Then in SDK right-click on HDMI_IN and select Run As->Launch on Hardware (System Debugger). Do you see the serial terminal menu? Is there an image on the VGA device?
  23. 1 point
    jpeyron

    ZYBO HDMI IN project

    Hi @birca123, The easiest way to use a newer version is to load the project in the intended version, in this case Vivado 2016.4, and then open it in the desired version of Vivado. Then upgrade the IPs and alter any XDC pin names that might have changed when the HDMI IPs were updated. Here is a project where I have upgraded the project to Vivado 2018.3. The project can be found in the proj folder. I did not validate that the project worked with SDK and monitors, but the Vivado project generated a bitstream. So you should only need to export the hardware including the bitstream, launch SDK and import the HDMI applications. best regards, Jon
  24. 1 point
    JColvin

    Read out .log file from OpenLogger

    Hi @Peggy, I spoke with some of the firmware folks for WFL and OpenLogger and learned that they haven't yet implemented the parsing of the header in the Digilent agent. I did receive a picture that shows the structure of the header file, which I have attached below. Thanks, JColvin
  25. 1 point
    I see, thanks jpeyron
  26. 1 point
    JColvin

    Pmod ISNS20 SPI Arduino

    Hi @tfcb, I am taking a look into this; I connected a level shifter of my own (Digilent's Pmod LVLSHFT rather than the Sparkfun one you linked to) to connect an Arduino Uno and Pmod ISNS20, but I too am getting strange values (no initial offset for example), so I'm debugging some more. Thanks, JColvin
  27. 1 point
    Ciprian

    Qustions about arty-z7 HDMI out demo

    Hi @Mingfei, If you are referring to the Arty-Z7-XX-hdmi-out demo, where XX is either 10 or 20, then you have a series of options to make it work faster. Before I go into detail, let's talk a bit about what the demo actually does. It basically reads one frame (dispCtrl.framePtr[dispCtrl.curFrame]) and then takes every pixel and inverts it into the output frame (dispCtrl.framePtr[nextFrame]). This is a software approach (based on how DemoInvertFrame() is written), which means that how fast it can process the data depends on the processor. Therefore, things like printing text to the console (like printf("Volage.....)) will slow it down. As for the solutions, there are two ways to try to speed it up: 1. Software: the VDMA has a feature called frame buffer parking; you can read all about it in the User Guide (PG020 for v6.3). It basically parks the output frame on one buffer and lets you edit the other two frames without interrupting the video out stream. This will increase your output to 50 fps, but the refresh rate of what you want to do, the actual processing, will still only run at 5 frames. 2. Hardware: you could take advantage of your FPGA and write an IP core which does the inversion in HW, thus offloading the task from the processor and getting almost no processing delay; this of course means you will have to redesign your project. I would recommend writing the IP in HLS, because it's easier, and placing it at the output stage between the VDMA and the Subset Converter. -Ciprian
  28. 1 point
    jpeyron

    OpenCV and Pcam5-c

    Hi @Esti.A, We have a reVISION project here for the Zybo-Z7-20 that uses SDSoC and OpenCV, which should be useful for your project. best regards, Jon
  29. 1 point
    If you want to set both pins at the same time, rather than in two separate statements, you could also do this: LATGSET = (1 << 15) | (1 << 6); // or 0b1000000001000000 These parts have registers that can do an atomic set/clear/invert, so that you don't have to do a read/modify/write of the register.
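The SET/CLR/INV semantics described above can be modeled in a few lines to check the mask arithmetic; this is a toy illustration of the register behavior, not real PIC32 code:

```python
class Latch:
    """Toy model of a PIC32-style output latch with atomic helper registers."""

    def __init__(self):
        self.value = 0

    def SET(self, mask):      # models LATxSET: set masked bits in one write
        self.value |= mask

    def CLR(self, mask):      # models LATxCLR: clear masked bits in one write
        self.value &= ~mask

    def INV(self, mask):      # models LATxINV: toggle masked bits in one write
        self.value ^= mask


lat = Latch()
lat.SET((1 << 15) | (1 << 6))   # both pins driven high with a single write
```

Because each hardware write is one bus transaction, no read-modify-write sequence is needed, so an interrupt cannot corrupt the update.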
  30. 1 point
    Hi, >> We are forced to work in assembly with picoblaze. you might have a look at the ZPU softcore CPU with GCC. The CPU is just a few hundred lines of code but most of its functionality is in software in the crt.o library in RAM. I understand it's quite well tested and has been used in commercial products. Not surprisingly, using an FPGA to implement a processor that then kinda emulates itself in software (aka RISC :) ) is maybe not the most efficient use of silicon - I'm sure it has many strong points but speed is not among them... Unfortunately, the broken-ness of Xilinx' DATA2MEM utility (to update the bitstream with a new .elf file) spoils the fun, at least when I tried in ISE14.7 (segfaults). When it works, the compile/build cycle takes only a second or two. Long-term, porting the processor to a new platform would be straightforward, or even fully transparent if using inferred, device-independent memory. This would also work for a bootloader that is hardcoded into default content in inferred RAM. I might consider this myself as a barebone "hackable" CPU platform strictly for educational purposes.
  31. 1 point
    Heya Jon, Now it seems resolved. And I can move on to other questions for Digilent (you can see the Xilinx thread you've referred to). Thanks for your help on this.
  32. 1 point
    bogdan.deac

    Inter core communication

    As far as I know, the NN training phase takes a long time and needs many resources. For this reason it is not recommended to train NNs on FPGAs. On the other hand, the FPGA is strong at inference. I advise you to use a GPU and a learning framework, like Caffe, for the training phase. Fortunately, Xilinx recently released a new development kit for NNs named the Deep Neural Network Development Kit (DNNDK). Here you have the user guide and the DNNDK extension for SDSoC. Have a look at the Xilinx documentation and forum posts to get familiar with all the concepts. Let us know if you have any questions.
  33. 1 point
    Hi @callum413 Tomorrow's beta build will support time stamps identified by the software. A high precision timer is not available in the device.
  34. 1 point
    Hi @Luighi Vitón, I'm not sure it's an HDMI issue; have you tried using X11 over SSH, and do you get the same result? -Ciprian
  35. 1 point
    stever

    Simultaneous Pattern and Data Capture

    OK, I found a Rate setting after unhiding advanced options so I now get the same number of samples in the LA. Thanks guys for the swift support Cheers stever
  36. 1 point
    Hi @askhunter, The top.vhd is already added to the project. If you want this file to be underneath design_1, then you should right-click on design_1 and select Add Sources. Then add the VHDL files you would like to add to the design. It might be easier to start with a fresh project. best regards, Jon
  37. 1 point
    jpeyron

    Arty A7 vs Nexys A7

    Hi @Phil, The Arty-A7-100T comes with a built-in USB JTAG/UART (J10) programming circuit. You will not need an additional JTAG programmer like the JTAG-HS2 to configure the Arty-A7-100T. Here is the Arty A7 resource center. Under Additional Resources there is a tutorial called Getting Started with MicroBlaze Servers that goes through making an Ethernet echo server with the Ethernet Lite IP core (no cost). best regards, Jon
  38. 1 point
    vicentiu

    build error petalinux

    It looks like you took our petalinux project and added your hdf file to it? System-user.dtsi makes references to axi, which was present in our Vivado design in the petalinux project, but you don't have axi in your design. Please remove all the references it's complaining about. Regarding a Petalinux "HOWTO", Xilinx's petalinux documentation is the best reference: https://www.xilinx.com/support/documentation/sw_manuals/xilinx2017_4/ug1144-petalinux-tools-reference-guide.pdf https://www.xilinx.com/support/documentation/sw_manuals/xilinx2017_4/ug1156-petalinux-tools-workflow-tutorial.pdf https://www.xilinx.com/support/documentation/sw_manuals/xilinx2017_4/ug1157-petalinux-tools-command-line-guide.pdf You'll find most documentation about system-user.dtsi in the first link, and some in the second. Make sure you're looking at the guide for the Petalinux version you are using.
  39. 1 point
    jpeyron

    Nexys 2 - transistor part number

    Hi @CVu, Welcome to the Digilent Forums! Q1 information is below: NTS2101P single P-channel power MOSFET, 1.4 A, 8 V, SOT-323 (SC-70) best regards, Jon
  40. 1 point
    zygot

    NexysDDR4 example projects

    OS file permissions are a two-edged sword. They can prevent users from changing stuff that shouldn't be changed, but they can also prevent users from doing their work. This is a user issue. You will have to learn how to change file permissions as a computer user to the extent that your privileges allow. Depending on the OS and how security is set up this can be a pain, especially when transferring files from one OS or computer to another. If you have an IT department, they should be able to help resolve issues. If you are the IT department, then you need to learn how to set up and use your OS safely and securely.
  41. 1 point
    kwilber

    Pmod DA3 clocking

    Inside the AD5541A, the MOSI bits get clocked into a shift register and are held there until the ~CS line goes high. At that time, the bits are transferred from the shift register to the D/A. It does not matter what level is on MOSI at that instant. In the traces I posted earlier, I included a transition from full scale output to 0. I also show several cycles of writing all possible values in a ramp. The resulting voltage waveform shows the AD5541A is seeing the data correctly. The last four writes to the pmod in the zoomed trace show sending the values 0, 1, 2 and 3 to the D/A. You can observe SCLK's transition in relation to the least significant bits of the data. SCLK is not transitioning when ~CS transitions to high so the data on MOSI is "don't care" at that instant. I did use different clocks since the microblaze can run at higher clock rates than the AD5541A. Also, when you are troubleshooting, it can sometimes help to slow down the logic. I see you are using pmod connector JB whereas my project used JA. Just as a test, you might want to try moving your PmodDA3 to JA and use my project as is to replicate my results. You should be able to launch vivado, open my project then immediately launch the SDK from vivado. You should not have to generate a bitfile. I had included the hardware handoff in the zip file I gave you so you have my exact bitfile. Once the SDK loads, it should automatically load the project and compile it. At that point you can program the fpga from inside the SDK and then run my example app. You should see a sawtooth waveform coming out of the PmodDA3 if all is well.
  42. 1 point
    Hi @Nithin Yes. See AnalogIO_DigitalDiscovery.py for how to adjust the IO voltage, enable VIO output, and set pull-up/downs, slew rate and drives. The DigitalIO functions can be used the same way, controlling DIO[24:31]. The DigitalOut functions are the same, but you have 32k custom bits/line for DIO[24:31]. The DigitalIn by default samples DIN[0:23]&DIO[24:31]. To sample the DIO lines first, DIO[24:39]&DIN[0:15], set FDwfDigitalInInputOrderSet(hdwf, true). This way, if you sample 16 bits you will get DIO[24:39]. The base frequency is 800 MHz (FDwfDigitalInInternalClockInfo); for example, use FDwfDigitalInDividerSet(hdwf, 8) for 100 MHz.
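The divider arithmetic attila mentions is simple enough to sanity-check before calling the SDK; this helper is my own, assuming the 800 MHz base clock reported by FDwfDigitalInInternalClockInfo:

```python
BASE_CLOCK_HZ = 800e6   # Digital Discovery DigitalIn base frequency

def sample_rate(divider: int) -> float:
    """Effective DigitalIn sample rate for a given clock divider."""
    if divider < 1:
        raise ValueError("divider must be >= 1")
    return BASE_CLOCK_HZ / divider

# divider 8 gives the 100 MHz mentioned in the post
rate = sample_rate(8)
```

The computed divider would then be passed to FDwfDigitalInDividerSet.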
  43. 1 point
    Hi, >> applying various algorithms If you don't know the algorithms yet, it might be easier to get the ADC data into a softcore CPU, then prototype in C using floating point. The point is, experimental prototyping of algorithms in fixed-point RTL is a slow and painful process, whereas reloading a .elf binary takes a second. I'm guessing your intent, but I suspect you'll find eventually that an FPGA is not the optimal platform choice for your application. Meaning, you can most likely get the same result cheaper and with less effort, e.g. using a Raspberry Pi + SPI converter. I'm guessing this simply because higher-rate ADCs where an FPGA makes sense (hundreds of MHz) are hard to come by as off-the-shelf modules. Otherwise, if you design for, say, 1 MSPS, the FPGA fabric will do less than 1 % of the work it could do but you pay for 100 %, so people usually don't use an FPGA if a CPU or DSP will do the job.
  44. 1 point
    kwilber

    NEXYS 3 frequency meter

    The problem is likely in the .ucf file where you define pin information. The error message says device pin LL8 doesn't exist. If you post the contents of your ucf, we can probably figure it out.
  45. 1 point
    @Jonathon Kay My recommendation is to start with this board and follow the tutorials from here. Download and install the recommended Vivado version, not the latest. This board is very affordable and has everything you need to start. The tutorials will give you a picture of the design process with Vivado. It will help to see the integration of the FPGA part in the system. The Udemy Zynq tutorials are applicable to this board with minor corrections. I wish I had had this when I started. It might be hard, but it is a lot of fun as well, and without fun who would do it?
  46. 1 point
    Hi @ezadobrischi, Welcome to the Digilent Forums. Please be more specific about the photodiode output. Based on a basic photodiode output I would: 1) Convert the photodiode receiver's current output to voltage and use an ADC to read the voltage signal. 2) The Nexys 4 DDR has an on-board XADC (Xilinx analog-to-digital converter); see the 7 Series FPGAs and Zynq-7000 SoC XADC Dual 12-Bit 1 MSPS Analog-to-Digital Converter User Guide. Here is our XADC demo for the Nexys 4 DDR done in Verilog. 3) The voltage input range is 0 V to 1 V in unipolar mode and -0.5 V to 0.5 V in bipolar mode for the on-board XADC. 4) If the voltage is not within 0 V - 1 V then I would either use a level shifter circuit to bring the analog signal into the 0-1 V range or use something like the Pmod LVLSHFT. 5) If the voltage is in the 3.3 V - 5 V range and you do not want to use a level shifter, you can use other ADCs like the Pmod AD1. Once you have the signal in the Nexys 4 DDR you can filter the signal. cheers, Jon
  47. 1 point
    It seems that something went wrong when preparing the first partition with the BOOT.BIN and image.ub files. First method: A minimal test we can do is to prepare only one FAT32 partition and copy the BOOT.BIN and image.ub files onto it, then start the board with this setup. This way it should only start the Linux kernel, without a rootfs. The UART should be initialized and the green LED should be on. Step 1: format the SD card as a FAT32 file system. Step 2: copy the BOOT.BIN and image.ub files to the SD card (then safely remove the SD card). Step 3: insert the SD card in the Zybo Z7-20 and power the board. Step 4: if the green LED turns on, it means that the UART (via FTDI) is ready. Alternative: The second file attached was z7-20.img.zip, which is a snapshot of a 2 GB SD card containing both partitions. The image can be written to the micro SD card with the `dd` tool on Linux or a similar tool for Windows (writing this image to the SD card from Windows 10 worked for me using Win32DiskImager). Step 1: extract z7-20.img.zip. Step 1.A: insert the micro SD card if not already inserted. Step 2: run Win32DiskImager. Step 3: for the `Image File` field, set the path to the z7-20.img file. z7-20.img sha1sum: e08516edb24ff65d32ce7a43a946f0be9b9f0ebe (if you want to check). Step 4: in the `Device` field make sure the drive letter for the micro SD is selected (if not, you risk losing data from other devices). Step 5: press Write. (If you are prompted to format partitions, please do not. The second partition's filesystem is ext4, which is not recognized by the OS.) Step 6: after the process has ended, safely remove the micro SD. Step 7: plug the micro SD into the Z7-20 board. Power up the board.
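Checking the image against the sha1sum given above can be done in a few lines; this is a generic sketch (the file path is illustrative), not a Digilent-provided tool:

```python
import hashlib


def sha1_of(path: str, chunk: int = 1 << 20) -> str:
    """Return the hex SHA-1 of a file, read in chunks to bound memory use."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()


# compare the result against the value from the post:
# sha1_of("z7-20.img") should equal
# "e08516edb24ff65d32ce7a43a946f0be9b9f0ebe"
```

Verifying before writing with `dd` or Win32DiskImager rules out a corrupt download as the cause of a non-booting card.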
  48. 1 point
    jpeyron

    pmod wifi

    Hi @harika, Using the Pmod WIFI's HTTP server you do not need to connect a PC to the router. You will need a device on the router's network to access the HTTP server. I used a WIFI-enabled device like a smartphone and logged onto the router's network; then, from an internet browser on my smartphone, I accessed the specific IP address of the HTTP server. thank you, Jon
  49. 1 point
    jpeyron

    DAC Pmod DA3 on Spartan 3 Starter Kit

    Hi @lwew96, Here is a VHDL Pmod DA3 project done by community member @hamster that is made in Vivado with the Basys 3. To make this project work with the Spartan 3 you will need to use the XDC file as a reference for the Spartan 3's UCF file. cheers, Jon
  50. 1 point
    @Ajay Ghodke, For being new, you've picked a pretty challenging project! Reading an image from an SD card requires several layers of processing. The first part is the physical layer. The specification outlines two means of communicating with a card's physical layer: using the SDIO protocol (a four-wire, bidirectional data/command channel), and using the SPI protocol. I have personally built a Verilog core that communicates over the SPI protocol, and you are welcome to use my core, while others have done their work with SDIO and ... I haven't tried their cores, so I cannot speak to them. Across this physical layer, and to get things up and running, a protocol handshake must take place between the card and the controller. This handshake is outlined in the physical specification above, and also in the instructions for the controller I just mentioned. For a given card, the handshake isn't all that difficult. Indeed, for a given class of cards it's probably not that bad. To process every possible SD card, finding appropriate examples to test with can be a challenge--just to prove that you can handle all the corner cases. Once you can handle communications at this level, you will be able to read and write sectors on the SD card. The next level, beyond sectors, involves reading and understanding the partition table. This table will tell you where the file systems are on the SD card, and more specifically, what file systems are on the SD card. In general, most SD cards have only one file system on them, so partition processing is pretty simple. That file system is a FAT filesystem--whether FAT16, FAT32, etc., I'm not certain. (I haven't gotten this far yet.) After the partition layer, you will need to process the file system. Assuming that your SD card has a FAT filesystem on it, there will be two types of files on the system: files containing directory entries, and other.
These files may be found in any order on the SD card, as specified by the file allocation table. That table contains one entry per cluster on the file system, telling you, after you read a given cluster, where to find the next one. (Clusters are similar to sectors, but may be implemented as groups of sectors.) If the filesystem is in proper order, the last cluster will identify itself as the last cluster. So, the steps to processing this filesystem involve: identifying which version of the FAT system you are working with; finding, from that information, the first cluster of the root directory; reading through the directory for the file you want (keep in mind, other clusters of this directory may be out of order--you'll need to look up their locations in the table); and, if your file is in a subdirectory, finding the subdirectory entry. Once you have the entry for the file (or subdirectory) you want, you'll find the location on the disk where that file begins (IIRC, it's the cluster number of the first sector of the file). You'll also find the length of the file (or subdirectory) you are interested in. If what you found was a subdirectory, and if it's the subdirectory your file is in (assuming it is in a subdirectory and not the main directory), you'll then need to repeat this process, reading the subdirectory "file" and looking for your file's entry (or perhaps another subdirectory entry). From the final entry, you will know where to find the first cluster of your file, and the full length of the file in bytes. (It may be longer or shorter than the number of clusters allocated for it in the allocation table.) The file allocation table will tell you where to find subsequent clusters. If all you wish to do is to change the image in place, then you now know where to find all the pieces. At this point, you can change the file as it exists on the SD card at will.
Creating new files on the SD card requires creating a directory entry for that file, finding empty clusters to hold the file, placing the number of the first cluster into the directory entry, adjusting the directory length, etc. It doesn't take long before this becomes an in-depth software task. I have seen some approaches where individuals have created their own partitions containing their own file system with their own format, just to avoid all of this hassle, and have been successful doing this within a microcontroller. While doable, such solutions tend to be application specific. Hope this helps, Dan
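The cluster-chain walk Dan describes can be sketched in a few lines of Python; this is a toy illustration with a made-up table and a FAT16-style end-of-chain marker, not code for a real card:

```python
EOC = 0xFFFF  # FAT16-style end-of-chain marker


def cluster_chain(fat: dict[int, int], first: int) -> list[int]:
    """Follow the file allocation table from a file's first cluster.

    Each table entry maps a cluster to the next cluster of the same file,
    so the file's clusters may be scattered anywhere on the card.
    """
    chain, cur = [], first
    while cur != EOC:
        chain.append(cur)
        cur = fat[cur]          # look up where the next piece lives
    return chain


# a file starting at cluster 5 whose clusters are scattered: 5 -> 9 -> 2
fat = {5: 9, 9: 2, 2: EOC}
```

A real implementation reads the table from the reserved FAT region and maps cluster numbers to sector addresses, but the chain-following logic is exactly this loop.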